Date: (Sun) Jun 12, 2016

Introduction:

Data: Source: Training: https://inclass.kaggle.com/c/can-we-predict-voting-outcomes/download/train2016.csv
New: https://inclass.kaggle.com/c/can-we-predict-voting-outcomes/download/test2016.csv
Time period:

Synopsis:

Based on analysis utilizing <> techniques:

Summary of key steps & error improvement stats:

Prediction Accuracy Enhancement Options:

  • transform.data chunk:
    • derive features from multiple features
  • manage.missing.data chunk:
    • Do not fill missing variables
    • Fill missing numerics with a different algorithm
    • Fill missing chars with data based on clusters


Potential next steps include:

  • Organization:
    • Categorize by chunk
    • Priority criteria:
      1. Ease of change
      2. Impacts report
      3. Cleans innards
      4. Bug report
  • all chunks:
    • at chunk end, rm() all objects not prefixed with glb_
  • manage.missing.data chunk:
    • cleaner way to manage re-splitting of training vs. new entity
  • extract.features chunk:
    • Add n-grams for glbFeatsText
      • “RTextTools”, “tau”, “RWeka”, and “textcat” packages
  • fit.models chunk:
    • Classification: Plot AUC Curves for all models & highlight glbMdlSel
    • Prediction accuracy scatter graph:
      • Add tiles (raw vs. PCA)
      • Use shiny for drop-down of “important” features
      • Use plot.ly for interactive plots ?

    • Change the .fit suffix of model metrics to .mdl if the metric is data-independent (e.g. AIC, Adj.R.Squared; is AIC truly data-independent?)
    • create a custom model for rpart that has minbucket as a tuning parameter
    • varImp for randomForest crashes in caret version 6.0.41 -> submit a bug report

  • Probability handling for multinomials vs. desired binomial outcome
  • ROCR currently supports only evaluation of binary classification tasks (version 1.0.7); extensions toward multiclass classification are scheduled for the next release

  • fit.all.training chunk:
    • myplot_prediction_classification: displays ‘x’ instead of ‘+’ when there are no prediction errors
  • Compare glb_sel_mdl vs. glb_fin_mdl:
    • varImp
    • Prediction differences (should be minimal?)
  • Move glb_analytics_diag_plots to mydsutils.R: (+) Easier to debug (-) Too many glb vars used
  • Add print(ggplot.petrinet(glb_analytics_pn) + coord_flip()) at the end of every major chunk
  • Parameterize glb_analytics_pn
  • Move glb_impute_missing_data to mydsutils.R: (-) Too many glb vars used; glb_<>_df reassigned
  • Do non-glm methods handle interaction terms ?
  • f-score computation for classifiers should be summation across outcomes (not just the desired one ?)
  • Add accuracy computation to glb_dmy_mdl in predict.data.new chunk
  • Why does splitting fit.data.training.all chunk into separate chunks add an overhead of ~30 secs ? It’s not rbind b/c other chunks have lower elapsed time. Is it the number of plots ?
  • Incorporate code chunks in print_sessionInfo
  • Test against
    • projects in github.com/bdanalytics
    • lectures in jhu-datascience track
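
The n-grams item above could be sketched with RWeka's tokenizer plugged into tm. A minimal sketch, assuming a tm corpus named txtCorpus built from glbFeatsText; note RWeka requires a working Java installation:

```r
require(tm)
require(RWeka)
# Tokenize into unigrams + bigrams for the document-term matrix
myNGramTokenizer <- function(x)
    NGramTokenizer(x, Weka_control(min = 1, max = 2))
txtDtm <- DocumentTermMatrix(txtCorpus,
                             control = list(tokenize = myNGramTokenizer))
```

The same tokenize hook would also accept tokenizers from the "tau" or "textcat" packages mentioned above.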

Analysis:

rm(list = ls())
set.seed(12345)
options(stringsAsFactors = FALSE)
source("~/Dropbox/datascience/R/mycaret.R")
source("~/Dropbox/datascience/R/mypetrinet.R")
source("~/Dropbox/datascience/R/myplclust.R")
source("~/Dropbox/datascience/R/myplot.R")
source("~/Dropbox/datascience/R/myscript.R")
source("~/Dropbox/datascience/R/mytm.R")
if (is.null(knitr::opts_current$get(name = 'label'))) # Running in IDE
    debugSource("~/Dropbox/datascience/R/mydsutils.R") else
    source("~/Dropbox/datascience/R/mydsutils.R")    
## Loading required package: caret
## Loading required package: lattice
# Gather all package requirements here
suppressPackageStartupMessages(require(doMC))
glbCores <- 10 # number of cores on the machine minus 2
registerDoMC(glbCores) 
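
The hard-coded core count could instead be derived at run time; a sketch, where glbCoresAlt is a hypothetical name (the value 10 above presumably reflects a 12-core machine):

```r
# Leave two cores free for the OS / IDE; never drop below one
glbCoresAlt <- max(1, parallel::detectCores() - 2)
```

Swap glbCoresAlt into registerDoMC() if portability across machines matters.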

suppressPackageStartupMessages(require(caret))
require(plyr)
## Loading required package: plyr
require(dplyr)
## Loading required package: dplyr
## 
## Attaching package: 'dplyr'
## The following objects are masked from 'package:plyr':
## 
##     arrange, count, desc, failwith, id, mutate, rename, summarise,
##     summarize
## The following objects are masked from 'package:stats':
## 
##     filter, lag
## The following objects are masked from 'package:base':
## 
##     intersect, setdiff, setequal, union
require(knitr)
## Loading required package: knitr
require(stringr)
## Loading required package: stringr
#source("dbgcaret.R")
#packageVersion("snow")
#require(sos); findFn("cosine", maxPages=2, sortby="MaxScore")

# Analysis control global variables
# Inputs
#   url/name = "<PathPointer>"; if url specifies a zip file, name = "<filename>"; 
#               or named collection of <PathPointer>s
#   sep = choose from c(NULL, "\t")
glbObsTrnFile <- list(url = "https://inclass.kaggle.com/c/can-we-predict-voting-outcomes/download/train2016.csv"
    # or list(url = c(NULL, <.inp1> = "<path1>", <.inp2> = "<path2>"))
    #, splitSpecs = list(method = "copy" # default when glbObsNewFile is NULL
    #                       select from c("copy", NULL ???, "condition", "sample", )
    #                      ,nRatio = 0.3 # > 0 && < 1 if method == "sample" 
    #                      ,seed = 123 # any integer or glbObsTrnPartitionSeed if method == "sample" 
    #                      ,condition = # or 'is.na(<var>)'; '<var> <condition_operator> <value>'    
    #                      )
    )                   
 
glbObsNewFile <- list(url = "https://inclass.kaggle.com/c/can-we-predict-voting-outcomes/download/test2016.csv") 

glbObsDropCondition <- #NULL # : default
#   enclose in single-quotes b/c condition might include double quotes
#       use | & ; NOT || &&    
#   '<condition>' 
    # 'grepl("^First Draft Video:", glbObsAll$Headline)'
    # 'is.na(glbObsAll[, glb_rsp_var_raw])'
    # '(is.na(glbObsAll[, glb_rsp_var_raw]) & grepl("Train", glbObsAll[, glbFeatsId]))'
    # 'is.na(strptime(glbObsAll[, "Date"], glbFeatsDateTime[["Date"]]["format"], tz = glbFeatsDateTime[["Date"]]["timezone"]))'
'(is.na(glbObsAll[, "Q109244"]) | (glbObsAll[, "Q109244"] != "No"))'
#nrow(do.call("subset",list(glbObsAll, parse(text=paste0("!(", glbObsDropCondition, ")")))))
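
glbObsDropCondition is a quoted expression that is later parsed and evaluated against glbObsAll. A toy illustration of the same mechanism; toyObs and the local condition string are hypothetical:

```r
toyObs <- data.frame(Q109244 = c("No", "Yes", NA))
dropCond <- '(is.na(toyObs[, "Q109244"]) | (toyObs[, "Q109244"] != "No"))'
# Rows where the condition is TRUE are dropped; NA in Q109244 counts as TRUE via is.na()
keptObs <- toyObs[!eval(parse(text = dropCond)), , drop = FALSE]
nrow(keptObs) # 1: only the "No" row survives
```

Note the comment above: use the vectorized | and &, never || and &&, inside the condition string.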
    
glb_obs_repartition_train_condition <- NULL # : default
#    "<condition>" 

glb_max_fitobs <- NULL # or any integer
glbObsTrnPartitionSeed <- 123 # or any integer
                         
glb_is_regression <- FALSE; glb_is_classification <- !glb_is_regression; 
    glb_is_binomial <- TRUE # or FALSE

glb_rsp_var_raw <- "Party"

# for classification, the response variable has to be a factor
glb_rsp_var <- "Party.fctr"

# if the response factor is based on numbers/logicals e.g (0/1 OR TRUE/FALSE vs. "A"/"B"), 
#   or contains spaces (e.g. "Not in Labor Force")
#   caret predict(..., type="prob") crashes
glb_map_rsp_raw_to_var <- #NULL 
function(raw) {
#     return(raw ^ 0.5)
#     return(log(raw))
#     return(log(1 + raw))
#     return(log10(raw)) 
#     return(exp(-raw / 2))
#     
# chk ref value against frequencies vs. alpha sort order
    ret_vals <- rep_len(NA, length(raw))
    ret_vals[!is.na(raw)] <- ifelse(raw[!is.na(raw)] == "Republican", "R", "D")
    return(relevel(as.factor(ret_vals), ref = "D")) 
    
#     as.factor(paste0("B", raw))
#     as.factor(gsub(" ", "\\.", raw))
    }

#if glb_rsp_var_raw is numeric:
#print(summary(glbObsAll[, glb_rsp_var_raw]))
#glb_map_rsp_raw_to_var(tst <- c(NA, as.numeric(summary(glbObsAll[, glb_rsp_var_raw])))) 

#if glb_rsp_var_raw is character:
#print(table(glbObsAll[, glb_rsp_var_raw], useNA = "ifany"))
# print(table(glb_map_rsp_raw_to_var(tst <- glbObsAll[, glb_rsp_var_raw]), useNA = "ifany"))

glb_map_rsp_var_to_raw <- #NULL 
function(var) {
#     return(var ^ 2.0)
#     return(exp(var))
#     return(10 ^ var) 
#     return(-log(var) * 2)
#     as.numeric(var)
#     levels(var)[as.numeric(var)]
    sapply(levels(var)[as.numeric(var)], function(elm) 
        if (is.na(elm)) return(elm) else
        if (elm == 'R') return("Republican") else
        if (elm == 'D') return("Democrat") else
        stop("glb_map_rsp_var_to_raw: unexpected value: ", elm)
        )  
#     gsub("\\.", " ", levels(var)[as.numeric(var)])
#     c("<=50K", " >50K")[as.numeric(var)]
#     c(FALSE, TRUE)[as.numeric(var)]
}
# print(table(glb_map_rsp_var_to_raw(glb_map_rsp_raw_to_var(tst)), useNA = "ifany"))
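
A quick round-trip sanity check of the two mappings above; assumes both functions are defined as shown:

```r
tst <- c(NA, "Republican", "Democrat")
rt  <- unname(glb_map_rsp_var_to_raw(glb_map_rsp_raw_to_var(tst)))
stopifnot(all(rt == tst, na.rm = TRUE)) # NA passes through both maps unchanged
```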

if ((glb_rsp_var != glb_rsp_var_raw) && is.null(glb_map_rsp_raw_to_var))
    stop("glb_map_rsp_raw_to_var function expected")

# List info gathered for various columns
# <col_name>:   <description>; <notes>
# USER_ID - an anonymous id unique to a given user
# YOB - the year of birth of the user
# Gender - the gender of the user, either Male or Female
# Income - the household income of the user. Either not provided, or one of "under $25,000", "$25,001 - $50,000", "$50,000 - $74,999", "$75,000 - $100,000", "$100,001 - $150,000", or "over $150,000".
# HouseholdStatus - the household status of the user. Either not provided, or one of "Domestic Partners (no kids)", "Domestic Partners (w/kids)", "Married (no kids)", "Married (w/kids)", "Single (no kids)", or "Single (w/kids)".
# EducationLevel - the education level of the user. Either not provided, or one of "Current K-12", "High School Diploma", "Current Undergraduate", "Associate's Degree", "Bachelor's Degree", "Master's Degree", or "Doctoral Degree".
# Party - the political party for whom the user intends to vote. Either "Democrat" or "Republican".
# Q124742, Q124122, . . . , Q96024 - 101 different questions that the users were asked on Show of Hands. If the user didn't answer the question, there is a blank. For information about the question text and possible answers, see the file Questions.pdf.

# currently does not handle more than 1 column; consider concatenating multiple columns
# If glbFeatsId == NULL, ".rownames <- as.numeric(row.names())" is the default
glbFeatsId <- "USER_ID" # choose from c(NULL : default, "<id_feat>") 
# glbFeatsCategory <- "Hhold.fctr" # choose from c(NULL : default, "<category_feat>")
# glbFeatsCategory <- "Q109244.fctr" # choose from c(NULL : default, "<category_feat>") 
glbFeatsCategory <- "Q115611.fctr" # choose from c(NULL : default, "<category_feat>")

# User-specified exclusions
glbFeatsExclude <- c(NULL
#   Feats that shd be excluded due to known causation by prediction variable
# , "<feat1", "<feat2>"
#   Feats that are factors with unique values (as % of nObs) > 49 (empirically derived)
#   Feats that are linear combinations (alias in glm)
#   Feature-engineering phase -> start by excluding all features except id & category & 
#       work each one in
    , "USER_ID", "YOB", "Gender", "Income", "HouseholdStatus", "EducationLevel" 
    ,"Q124742","Q124122" 
    ,"Q123621","Q123464"
    ,"Q122771","Q122770","Q122769","Q122120"
    ,"Q121700","Q121699","Q121011"
    ,"Q120978","Q120650","Q120472","Q120379","Q120194","Q120014","Q120012" 
    ,"Q119851","Q119650","Q119334"
    ,"Q118892","Q118237","Q118233","Q118232","Q118117"
    ,"Q117193","Q117186"
    ,"Q116797","Q116881","Q116953","Q116601","Q116441","Q116448","Q116197"
    ,"Q115602","Q115777","Q115610","Q115611","Q115899","Q115390","Q115195"
    ,"Q114961","Q114748","Q114517","Q114386","Q114152"
    ,"Q113992","Q113583","Q113584","Q113181"
    ,"Q112478","Q112512","Q112270"
    ,"Q111848","Q111580","Q111220"
    ,"Q110740"
    ,"Q109367","Q109244"
    ,"Q108950","Q108855","Q108617","Q108856","Q108754","Q108342","Q108343"
    ,"Q107869","Q107491"
    ,"Q106993","Q106997","Q106272","Q106388","Q106389","Q106042"
    ,"Q105840","Q105655"
    ,"Q104996"
    ,"Q103293"
    ,"Q102906","Q102674","Q102687","Q102289","Q102089"
    ,"Q101162","Q101163","Q101596"
    ,"Q100689","Q100680","Q100562","Q100010"
    ,"Q99982"
    ,"Q99716"
    ,"Q99581"
    ,"Q99480"
    ,"Q98869"
    ,"Q98578"
    ,"Q98197"
    ,"Q98059","Q98078"
    ,"Q96024" # Done
    ,".pos") 
if (glb_rsp_var_raw != glb_rsp_var)
    glbFeatsExclude <- union(glbFeatsExclude, glb_rsp_var_raw)                    

glbFeatsInteractionOnly <- list()
#glbFeatsInteractionOnly[["<child_feat>"]] <- "<parent_feat>"
glbFeatsInteractionOnly[["YOB.Age.dff"]] <- "YOB.Age.fctr"

glbFeatsDrop <- c(NULL
                # , "<feat1>", "<feat2>"
                )

glb_map_vars <- NULL # or c("<var1>", "<var2>")
glb_map_urls <- list();
# glb_map_urls[["<var1>"]] <- "<var1.url>"

# Derived features; Use this mechanism to cleanse data ??? Cons: Data duplication ???
glbFeatsDerive <- list();

# glbFeatsDerive[["<feat.my.sfx>"]] <- list(
#     mapfn = function(<arg1>, <arg2>) { return(function(<arg1>, <arg2>)) } 
#   , args = c("<arg1>", "<arg2>"))
#myprint_df(data.frame(ImageId = mapfn(glbObsAll$.src, glbObsAll$.pos)))
#data.frame(ImageId = mapfn(glbObsAll$.src, glbObsAll$.pos))[7045:7055, ]

    # character
#     mapfn = function(Education) { raw <- Education; raw[is.na(raw)] <- "NA.my"; return(as.factor(raw)) } 
#     mapfn = function(Week) { return(substr(Week, 1, 10)) }
#     mapfn = function(Name) { return(sapply(Name, function(thsName) 
#                                             str_sub(unlist(str_split(thsName, ","))[1], 1, 1))) } 

#     mapfn = function(descriptor) { return(plyr::revalue(descriptor, c(
#         "ABANDONED BUILDING"  = "OTHER",
#         "**"                  = "**"
#                                           ))) }

#     mapfn = function(description) { mod_raw <- description;
    # This is here because it does not work if it's in txt_map_filename
#         mod_raw <- gsub(paste0(c("\n", "\211", "\235", "\317", "\333"), collapse = "|"), " ", mod_raw)
    # Don't parse for "." because of ".com"; use customized gsub for that text
#         mod_raw <- gsub("(\\w)(!|\\*|,|-|/)(\\w)", "\\1\\2 \\3", mod_raw);
    #   Some state acronyms need context for separation, e.g. 
    #   LA/L.A. could either be "Louisiana" or "LosAngeles"
        # modRaw <- gsub("\\bL\\.A\\.( |,|')", "LosAngeles\\1", modRaw);
    #   OK/O.K. could either be "Oklahoma" or "Okay"
#         modRaw <- gsub("\\bACA OK\\b", "ACA OKay", modRaw); 
#         modRaw <- gsub("\\bNow O\\.K\\.\\b", "Now OKay", modRaw);        
    #   PR/P.R. could either be "PuertoRico" or "Public Relations"        
        # modRaw <- gsub("\\bP\\.R\\. Campaign", "PublicRelations Campaign", modRaw);        
    #   VA/V.A. could either be "Virginia" or "VeteransAdministration"        
        # modRaw <- gsub("\\bthe V\\.A\\.\\:", "the VeteranAffairs:", modRaw);
    #   
    # Custom mods

#         return(mod_raw) }

    # numeric
# Create feature based on record position/id in data   
glbFeatsDerive[[".pos"]] <- list(
    mapfn = function(raw1) { return(1:length(raw1)) }
    , args = c(".rnorm"))
# glbFeatsDerive[[".pos.y"]] <- list(
#     mapfn = function(raw1) { return(1:length(raw1)) }       
#     , args = c(".rnorm"))    

# Add logs of numerics that are not distributed normally
#   Derive & keep multiple transformations of the same feature, if normality is hard to achieve with just one transformation
#   Right skew: logp1; sqrt; ^ 1/3; logp1(logp1); log10; exp(-<feat>/constant)
# glbFeatsDerive[["WordCount.log1p"]] <- list(
#     mapfn = function(WordCount) { return(log1p(WordCount)) } 
#   , args = c("WordCount"))
# glbFeatsDerive[["WordCount.root2"]] <- list(
#     mapfn = function(WordCount) { return(WordCount ^ (1/2)) } 
#   , args = c("WordCount"))
# glbFeatsDerive[["WordCount.nexp"]] <- list(
#     mapfn = function(WordCount) { return(exp(-WordCount)) } 
#   , args = c("WordCount"))
#print(summary(glbObsAll$WordCount))
#print(summary(mapfn(glbObsAll$WordCount)))
    
# If imputation shd be skipped for this feature
# glbFeatsDerive[["District.fctr"]] <- list(
#     mapfn = function(District) {
#         raw <- District;
#         ret_vals <- rep_len("NA", length(raw)); 
#         ret_vals[!is.na(raw)] <- sapply(raw[!is.na(raw)], function(elm) 
#                                         ifelse(elm < 10, "1-9", 
#                                         ifelse(elm < 20, "10-19", "20+")));
#         return(relevel(as.factor(ret_vals), ref = "NA"))
#     }       
#     , args = c("District"))    

# YOB options:
# 1. Missing data:
# 1.1   0 -> Does not improve baseline
# 1.2   Cut factors & "NA" is a level
# 2. Data corrections: < 1928 & > 2000
# 3. Scale YOB
# 4. Add Age
# YOB.Age.fctr needs to be synced with YOB.Age.dff; Create a separate sub-function ???
glbFeatsDerive[["YOB.Age.fctr"]] <- list(
    mapfn = function(raw1) {
        raw <- 2016 - raw1 
        # raw[!is.na(raw) & raw >= 2010] <- NA
        raw[!is.na(raw) & (raw <= 15)] <- NA
        raw[!is.na(raw) & (raw >= 90)] <- NA        
        retVal <- rep_len("NA", length(raw))
        # breaks = c(1879, seq(1949, 1989, 10), 2049)
        # cutVal <- cut(raw[!is.na(raw)], breaks = breaks, 
        #               labels = as.character(breaks + 1)[1:(length(breaks) - 1)])
        cutVal <- cut(raw[!is.na(raw)], breaks = c(15, 20, 25, 30, 35, 40, 50, 65, 90))
        retVal[!is.na(raw)] <- levels(cutVal)[cutVal]
        return(factor(retVal, levels = c("NA"
                ,"(15,20]","(20,25]","(25,30]","(30,35]","(35,40]","(40,50]","(50,65]","(65,90]"),
                        ordered = TRUE))
    }
    , args = c("YOB"))
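
A spot check of the age-bucket mapfn above; assumes glbFeatsDerive has been populated as shown:

```r
ageFctr <- glbFeatsDerive[["YOB.Age.fctr"]]$mapfn
# 1980 -> age 36 -> "(35,40]"; NA and out-of-range years map to the "NA" level
print(ageFctr(c(1980, NA, 2010, 1926)))
```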

# YOB.Age.fctr needs to be synced with YOB.Age.dff; Create a separate sub-function ???
glbFeatsDerive[["YOB.Age.dff"]] <- list(
    mapfn = function(raw1) {
        raw <- 2016 - raw1 
        raw[!is.na(raw) & (raw <= 15)] <- NA
        raw[!is.na(raw) & (raw >= 90)] <- NA        
        breaks <- c(15, 20, 25, 30, 35, 40, 50, 65, 90)

        # retVal <- rep_len(0, length(raw))
        # use & (vectorized), not && (scalar), inside these checks
        stopifnot(sum(!is.na(raw) & (raw <= 15)) == 0)
        stopifnot(sum(!is.na(raw) & (raw >= 90)) == 0) 
        # msk <- !is.na(raw) && (raw > 15) && (raw <= 20); if (sum(msk > 0)) retVal[msk] <- raw[msk] - 15
        # msk <- !is.na(raw) && (raw > 20) && (raw <= 25); if (sum(msk > 0)) retVal[msk] <- raw[msk] - 20
        # msk <- !is.na(raw) && (raw > 25) && (raw <= 30); if (sum(msk > 0)) retVal[msk] <- raw[msk] - 25
        # msk <- !is.na(raw) && (raw > 30) && (raw <= 35); if (sum(msk > 0)) retVal[msk] <- raw[msk] - 30
        # msk <- !is.na(raw) && (raw > 35) && (raw <= 40); if (sum(msk > 0)) retVal[msk] <- raw[msk] - 35
        # msk <- !is.na(raw) && (raw > 40) && (raw <= 50); if (sum(msk > 0)) retVal[msk] <- raw[msk] - 40
        # msk <- !is.na(raw) && (raw > 50) && (raw <= 65); if (sum(msk > 0)) retVal[msk] <- raw[msk] - 50
        # msk <- !is.na(raw) && (raw > 65) && (raw <= 90); if (sum(msk > 0)) retVal[msk] <- raw[msk] - 65

        retVal <- sapply(raw, function(age) {
            if (is.na(age)) return(0) else
            if ((age > 15) && (age <= 20)) return(age - 15) else
            if ((age > 20) && (age <= 25)) return(age - 20) else
            if ((age > 25) && (age <= 30)) return(age - 25) else
            if ((age > 30) && (age <= 35)) return(age - 30) else
            if ((age > 35) && (age <= 40)) return(age - 35) else
            if ((age > 40) && (age <= 50)) return(age - 40) else
            if ((age > 50) && (age <= 65)) return(age - 50) else
            if ((age > 65) && (age <= 90)) return(age - 65)
        })
        
        return(retVal)
    }
    , args = c("YOB"))
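
The per-element sapply above can be replaced by a vectorized findInterval() lookup. A sketch under the same breaks, with left.open = TRUE to match the (lo, hi] intervals:

```r
ageDff <- function(raw) {
    breaks <- c(15, 20, 25, 30, 35, 40, 50, 65, 90)
    # index of the lower bound of each age's (lo, hi] interval
    idx <- findInterval(raw, breaks, left.open = TRUE)
    ifelse(is.na(raw), 0, raw - breaks[idx])
}
ageDff(c(NA, 18, 36, 70)) # 0 3 1 5
```

findInterval(left.open = TRUE) needs R >= 3.3, which predates this analysis.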

glbFeatsDerive[["Gender.fctr"]] <- list(
    mapfn = function(raw1) {
        raw <- raw1
        raw[raw %in% ""] <- "N"
        raw <- gsub("Male"  , "M", raw, fixed = TRUE)
        raw <- gsub("Female", "F", raw, fixed = TRUE)        
        return(relevel(as.factor(raw), ref = "N"))
    }
    , args = c("Gender"))

glbFeatsDerive[["Income.fctr"]] <- list(
    mapfn = function(raw1) { raw <- raw1;
        raw[raw %in% ""] <- "N"
        raw <- gsub("under $25,000"      , "<25K"    , raw, fixed = TRUE)
        raw <- gsub("$25,001 - $50,000"  , "25-50K"  , raw, fixed = TRUE)
        raw <- gsub("$50,000 - $74,999"  , "50-75K"  , raw, fixed = TRUE)
        raw <- gsub("$75,000 - $100,000" , "75-100K" , raw, fixed = TRUE)        
        raw <- gsub("$100,001 - $150,000", "100-150K", raw, fixed = TRUE)
        raw <- gsub("over $150,000"      , ">150K"   , raw, fixed = TRUE)        
        return(factor(raw, levels = c("N","<25K","25-50K","50-75K","75-100K","100-150K",">150K"),
                      ordered = TRUE))
    }
    , args = c("Income"))

glbFeatsDerive[["Hhold.fctr"]] <- list(
    mapfn = function(raw1) { raw <- raw1;
        raw[raw %in% ""] <- "N"
        raw <- gsub("Domestic Partners (no kids)", "PKn", raw, fixed = TRUE)
        raw <- gsub("Domestic Partners (w/kids)" , "PKy", raw, fixed = TRUE)        
        raw <- gsub("Married (no kids)"          , "MKn", raw, fixed = TRUE)
        raw <- gsub("Married (w/kids)"           , "MKy", raw, fixed = TRUE)        
        raw <- gsub("Single (no kids)"           , "SKn", raw, fixed = TRUE)
        raw <- gsub("Single (w/kids)"            , "SKy", raw, fixed = TRUE)        
        return(relevel(as.factor(raw), ref = "N"))
    }
    , args = c("HouseholdStatus"))

glbFeatsDerive[["Edn.fctr"]] <- list(
    mapfn = function(raw1) { raw <- raw1;
        raw[raw %in% ""] <- "N"
        raw <- gsub("Current K-12"         , "K12", raw, fixed = TRUE)
        raw <- gsub("High School Diploma"  , "HSD", raw, fixed = TRUE)        
        raw <- gsub("Current Undergraduate", "CCg", raw, fixed = TRUE)
        raw <- gsub("Associate's Degree"   , "Ast", raw, fixed = TRUE)
        raw <- gsub("Bachelor's Degree"    , "Bcr", raw, fixed = TRUE)        
        raw <- gsub("Master's Degree"      , "Msr", raw, fixed = TRUE)
        raw <- gsub("Doctoral Degree"      , "PhD", raw, fixed = TRUE)        
        return(factor(raw, levels = c("N","K12","HSD","CCg","Ast","Bcr","Msr","PhD"),
                      ordered = TRUE))
    }
    , args = c("EducationLevel"))

# for (qsn in c("Q124742","Q124122"))
# for (qsn in grep("Q12(.{4})(?!\\.fctr)", names(glbObsTrn), value = TRUE, perl = TRUE))
for (qsn in grep("Q", glbFeatsExclude, fixed = TRUE, value = TRUE))    
    glbFeatsDerive[[paste0(qsn, ".fctr")]] <- list(
        mapfn = function(raw1) {
            raw1[raw1 %in% ""] <- "NA"
            rawVal <- unique(raw1)
            
            if (length(setdiff(rawVal, (expVal <- c("NA", "No", "Yes")))) == 0) {
                raw1 <- gsub("Yes", "Ys", raw1, fixed = TRUE)
                if (length(setdiff(rawVal, expVal)) > 0)
                    stop(qsn, " vals: ", paste0(rawVal, collapse = "|"), 
                         " does not match expectation: ", paste0(expVal, collapse = "|"))
            } else
            if (length(setdiff(rawVal, (expVal <- c("NA", "Me", "Circumstances")))) == 0) {
                raw1 <- gsub("Circumstances", "Cs", raw1, fixed = TRUE)
                if (length(setdiff(rawVal, expVal)) > 0)
                    stop(qsn, " vals: ", paste0(rawVal, collapse = "|"), 
                         " does not match expectation: ", paste0(expVal, collapse = "|"))
            } else
            if (length(setdiff(rawVal, (expVal <- c("NA", "Grrr people", "Yay people!")))) == 0) {
                raw1 <- gsub("Grrr people", "Gr", raw1, fixed = TRUE)
                raw1 <- gsub("Yay people!", "Yy", raw1, fixed = TRUE)                
                if (length(setdiff(rawVal, expVal)) > 0)
                    stop(qsn, " vals: ", paste0(rawVal, collapse = "|"), 
                         " does not match expectation: ", paste0(expVal, collapse = "|"))
            } else
            if (length(setdiff(rawVal, (expVal <- c("NA", "Idealist", "Pragmatist")))) == 0) {
                raw1 <- gsub("Idealist"  , "Id", raw1, fixed = TRUE)
                raw1 <- gsub("Pragmatist", "Pr", raw1, fixed = TRUE)                
                if (length(setdiff(rawVal, expVal)) > 0)
                    stop(qsn, " vals: ", paste0(rawVal, collapse = "|"), 
                         " does not match expectation: ", paste0(expVal, collapse = "|"))
            } else
            if (length(setdiff(rawVal, (expVal <- c("NA", "Private", "Public")))) == 0) {
                raw1 <- gsub("Private", "Pt", raw1, fixed = TRUE)
                raw1 <- gsub("Public" , "Pc", raw1, fixed = TRUE)                
                if (length(setdiff(rawVal, expVal)) > 0)
                    stop(qsn, " vals: ", paste0(rawVal, collapse = "|"), 
                         " does not match expectation: ", paste0(expVal, collapse = "|"))
            }
            
            return(relevel(as.factor(raw1), ref = "NA"))
        }
        , args = c(qsn))
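
One caveat with building the mapfn closures in a for loop: all of them share the loop environment, so qsn inside each mapfn resolves to the loop variable's final value by the time the function runs (harmless here, since qsn only appears in error messages). A minimal illustration, with a local() wrapper that pins the value per iteration:

```r
fns <- list()
for (v in c("a", "b"))
    fns[[v]] <- local({ vCopy <- v; function() vCopy })
fns[["a"]]() # "a"; without local(), both closures would return "b"
```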

# If imputation of missing data is not working ...
# glbFeatsDerive[["FertilityRate.nonNA"]] <- list(
#     mapfn = function(FertilityRate, Region) {
#         RegionMdn <- tapply(FertilityRate, Region, FUN = median, na.rm = TRUE)
# 
#         retVal <- FertilityRate
#         retVal[is.na(FertilityRate)] <- RegionMdn[Region[is.na(FertilityRate)]]
#         return(retVal)
#     }
#     , args = c("FertilityRate", "Region"))
    
#     mapfn = function(HOSPI.COST) { return(cut(HOSPI.COST, 5, breaks = c(0, 100000, 200000, 300000, 900000), labels = NULL)) }     
#     mapfn = function(Rasmussen)  { return(ifelse(sign(Rasmussen) >= 0, 1, 0)) } 
#     mapfn = function(startprice) { return(startprice ^ (1/2)) }       
#     mapfn = function(startprice) { return(log(startprice)) }   
#     mapfn = function(startprice) { return(exp(-startprice / 20)) }
#     mapfn = function(startprice) { return(scale(log(startprice))) }     
#     mapfn = function(startprice) { return(sign(sprice.predict.diff) * (abs(sprice.predict.diff) ^ (1/10))) }        

    # factor      
#     mapfn = function(PropR) { return(as.factor(ifelse(PropR >= 0.5, "Y", "N"))) }
#     mapfn = function(productline, description) { as.factor(gsub(" ", "", productline)) }
#     mapfn = function(purpose) { return(relevel(as.factor(purpose), ref="all_other")) }
#     mapfn = function(raw) { tfr_raw <- as.character(cut(raw, 5)); 
#                             tfr_raw[is.na(tfr_raw)] <- "NA.my";
#                             return(as.factor(tfr_raw)) }
#     mapfn = function(startprice.log10) { return(cut(startprice.log10, 3)) }
#     mapfn = function(startprice.log10) { return(cut(sprice.predict.diff, c(-1000, -100, -10, -1, 0, 1, 10, 100, 1000))) }    

#     , args = c("<arg1>"))
    
    # multiple args
#     mapfn = function(id, date) { return(paste(as.character(id), as.character(date), sep = "#")) }        
#     mapfn = function(PTS, oppPTS) { return(PTS - oppPTS) }
#     mapfn = function(startprice.log10.predict, startprice) {
#                  return(spdiff <- (10 ^ startprice.log10.predict) - startprice) } 
#     mapfn = function(productline, description) { as.factor(
#         paste(gsub(" ", "", productline), as.numeric(nchar(description) > 0), sep = "*")) }
#     mapfn = function(.src, .pos) { 
#         return(paste(.src, sprintf("%04d", 
#                                    ifelse(.src == "Train", .pos, .pos - 7049)
#                                    ), sep = "#")) }       

# # If glbObsAll is not sorted in the desired manner
#     mapfn=function(Week) { return(coredata(lag(zoo(orderBy(~Week, glbObsAll)$ILI), -2, na.pad=TRUE))) }
#     mapfn=function(ILI) { return(coredata(lag(zoo(ILI), -2, na.pad=TRUE))) }
#     mapfn=function(ILI.2.lag) { return(log(ILI.2.lag)) }

# glbFeatsDerive[["<var1>"]] <- glbFeatsDerive[["<var2>"]]

# tst <- "descr.my"; args_lst <- NULL; for (arg in glbFeatsDerive[[tst]]$args) args_lst[[arg]] <- glbObsAll[, arg]; print(head(args_lst[[arg]])); print(head(drv_vals <- do.call(glbFeatsDerive[[tst]]$mapfn, args_lst))); 
# print(which_ix <- which(args_lst[[arg]] == 0.75)); print(drv_vals[which_ix]); 

glbFeatsDateTime <- list()
# Use OlsonNames() to enumerate supported time zones
# glbFeatsDateTime[["<DateTimeFeat>"]] <- 
#     c(format = "%Y-%m-%d %H:%M:%S" or "%m/%e/%y", timezone = "US/Eastern", impute.na = TRUE, 
#       last.ctg = FALSE, poly.ctg = FALSE)

glbFeatsPrice <- NULL # or c("<price_var>")

glbFeatsImage <- list() #list(<imageFeat> = list(patchSize = 10)) # if patchSize not specified, no patch computation

glbFeatsText <- list()
Sys.setlocale("LC_ALL", "C") # For English
## [1] "C/C/C/C/C/en_US.UTF-8"
#glbFeatsText[["<TextFeature>"]] <- list(NULL,
#   ,names = myreplacePunctuation(str_to_lower(gsub(" ", "", c(NULL, 
#       <comma-separated-screened-names>
#   ))))
#   ,rareWords = myreplacePunctuation(str_to_lower(gsub(" ", "", c(NULL, 
#       <comma-separated-nonSCOWL-words>
#   ))))
#)

# Text Processing Step: custom modifications not present in txt_munge -> use glbFeatsDerive
# Text Processing Step: universal modifications
glb_txt_munge_filenames_pfx <- "<projectId>_mytxt_"

# Text Processing Step: tolower
# Text Processing Step: myreplacePunctuation
# Text Processing Step: removeWords
glb_txt_stop_words <- list()
# Remember to use unstemmed words
if (length(glbFeatsText) > 0) {
    require(tm)
    require(stringr)

    glb_txt_stop_words[["<txt_var>"]] <- sort(myreplacePunctuation(str_to_lower(gsub(" ", "", c(NULL
        # Remove any words from stopwords            
#         , setdiff(myreplacePunctuation(stopwords("english")), c("<keep_wrd1>", <keep_wrd2>"))
                                
        # Remove salutations
        ,"mr","mrs","dr","Rev"                                

        # Remove misc
        #,"th" # Happy [[:digit::]]+th birthday 

        # Remove terms present in Trn only or New only; search for "Partition post-stem"
        #   ,<comma-separated-terms>        

        # cor.y.train == NA
#         ,unlist(strsplit(paste(c(NULL
#           ,"<comma-separated-terms>"
#         ), collapse=",")

        # freq == 1; keep c("<comma-separated-terms-to-keep>")
            # ,<comma-separated-terms>

        # chisq.pval high (e.g. == 1); keep c("<comma-separated-terms-to-keep>")
            # ,<comma-separated-terms>

        # nzv.freqRatio high (e.g. >= glbFeatsNzvFreqMax); keep c("<comma-separated-terms-to-keep>")
            # ,<comma-separated-terms>        
                                            )))))
}
#orderBy(~term, glb_post_stem_words_terms_df_lst[[txtFeat]][grep("^man", glb_post_stem_words_terms_df_lst[[txtFeat]]$term), ])
#glbObsAll[glb_post_stem_words_terms_mtrx_lst[[txtFeat]][, 4866] > 0, c(glb_rsp_var, txtFeat)]

# To identify terms with a specific freq
#paste0(sort(subset(glb_post_stop_words_terms_df_lst[[txtFeat]], freq == 1)$term), collapse = ",")
#paste0(sort(subset(glb_post_stem_words_terms_df_lst[[txtFeat]], freq <= 2)$term), collapse = ",")
#subset(glb_post_stem_words_terms_df_lst[[txtFeat]], term %in% c("zinger"))

# To identify terms with a specific freq & 
#   are not stemmed together later OR is value of color.fctr (e.g. gold)
#paste0(sort(subset(glb_post_stop_words_terms_df_lst[[txtFeat]], (freq == 1) & !(term %in% c("blacked","blemish","blocked","blocks","buying","cables","careful","carefully","changed","changing","chargers","cleanly","cleared","connect","connects","connected","contains","cosmetics","default","defaulting","defective","definitely","describe","described","devices","displays","drop","drops","engravement","excellant","excellently","feels","fix","flawlessly","frame","framing","gentle","gold","guarantee","guarantees","handled","handling","having","install","iphone","iphones","keeped","keeps","known","lights","line","lining","liquid","liquidation","looking","lots","manuals","manufacture","minis","most","mostly","network","networks","noted","opening","operated","performance","performs","person","personalized","photograph","physically","placed","places","powering","pre","previously","products","protection","purchasing","returned","rotate","rotation","running","sales","second","seconds","shipped","shuts","sides","skin","skinned","sticker","storing","thats","theres","touching","unusable","update","updates","upgrade","weeks","wrapped","verified","verify") ))$term), collapse = ",")

#print(subset(glb_post_stem_words_terms_df_lst[[txtFeat]], (freq <= 2)))
#glbObsAll[which(terms_mtrx[, 229] > 0), glbFeatsText]

# To identify terms with cor.y == NA
#orderBy(~-freq+term, subset(glb_post_stop_words_terms_df_lst[[txtFeat]], is.na(cor.y)))
#paste(sort(subset(glb_post_stop_words_terms_df_lst[[txtFeat]], is.na(cor.y))[, "term"]), collapse=",")
#orderBy(~-freq+term, subset(glb_post_stem_words_terms_df_lst[[txtFeat]], is.na(cor.y)))

# To identify terms with low cor.y.abs
#head(orderBy(~cor.y.abs+freq+term, subset(glb_post_stem_words_terms_df_lst[[txtFeat]], !is.na(cor.y))), 5)

# To identify terms with high chisq.pval
#subset(glb_post_stem_words_terms_df_lst[[txtFeat]], chisq.pval > 0.99)
#paste0(sort(subset(glb_post_stem_words_terms_df_lst[[txtFeat]], (chisq.pval > 0.99) & (freq <= 10))$term), collapse=",")
#paste0(sort(subset(glb_post_stem_words_terms_df_lst[[txtFeat]], (chisq.pval > 0.9))$term), collapse=",")
#head(orderBy(~-chisq.pval+freq+term, glb_post_stem_words_terms_df_lst[[txtFeat]]), 5)
#glbObsAll[glb_post_stem_words_terms_mtrx_lst[[txtFeat]][, 68] > 0, glbFeatsText]
#orderBy(~term, glb_post_stem_words_terms_df_lst[[txtFeat]][grep("^m", glb_post_stem_words_terms_df_lst[[txtFeat]]$term), ])

# To identify terms with high nzv.freqRatio
#summary(glb_post_stem_words_terms_df_lst[[txtFeat]]$nzv.freqRatio)
#paste0(sort(setdiff(subset(glb_post_stem_words_terms_df_lst[[txtFeat]], (nzv.freqRatio >= glbFeatsNzvFreqMax) & (freq < 10) & (chisq.pval >= 0.05))$term, c( "128gb","3g","4g","gold","ipad1","ipad3","ipad4","ipadair2","ipadmini2","manufactur","spacegray","sprint","tmobil","verizon","wifion"))), collapse=",")

# To identify obs with a txt term
#tail(orderBy(~-freq+term, glb_post_stop_words_terms_df_lst[[txtFeat]]), 20)
#mydspObs(list(descr.my.contains="non"), cols=c("color", "carrier", "cellular", "storage"))
#grep("ever", dimnames(terms_stop_mtrx)$Terms)
#which(terms_stop_mtrx[, grep("ipad", dimnames(terms_stop_mtrx)$Terms)] > 0)
#glbObsAll[which(terms_stop_mtrx[, grep("16", dimnames(terms_stop_mtrx)$Terms)[1]] > 0), c(glbFeatsCategory, "storage", txtFeat)]

# Text Processing Step: screen for names # Move to glbFeatsText specs section in order of text processing steps
# glbFeatsText[["<txtFeat>"]]$names <- myreplacePunctuation(str_to_lower(gsub(" ", "", c(NULL
#         # Person names for names screening
#         ,<comma-separated-list>
#         
#         # Company names
#         ,<comma-separated-list>
#                     
#         # Product names
#         ,<comma-separated-list>
#     ))))

# glbFeatsText[["<txtFeat>"]]$rareWords <- myreplacePunctuation(str_to_lower(gsub(" ", "", c(NULL
#         # Words not in SCOWL db
#         ,<comma-separated-list>
#     ))))

# To identify char vectors post glbFeatsTextMap
#grep("six(.*)hour", glb_txt_chr_lst[[txtFeat]], ignore.case = TRUE, value = TRUE)
#grep("[S|s]ix(.*)[H|h]our", glb_txt_chr_lst[[txtFeat]], value = TRUE)

# To identify whether terms should be synonyms
#orderBy(~term, glb_post_stop_words_terms_df_lst[[txtFeat]][grep("^moder", glb_post_stop_words_terms_df_lst[[txtFeat]]$term), ])
# term_row_df <- glb_post_stop_words_terms_df_lst[[txtFeat]][grep("^came$", glb_post_stop_words_terms_df_lst[[txtFeat]]$term), ]
# 
# cor(glb_post_stop_words_terms_mtrx_lst[[txtFeat]][glbObsAll$.lcn == "Fit", term_row_df$pos], glbObsTrn[, glb_rsp_var], use="pairwise.complete.obs")

# To identify which stopped words are "close" to a txt term
#sort(glbFeatsCluster)

# Text Processing Step: stemDocument
# To identify stemmed txt terms
#glb_post_stop_words_terms_df_lst[[txtFeat]][grep("^la$", glb_post_stop_words_terms_df_lst[[txtFeat]]$term), ]
#orderBy(~term, glb_post_stem_words_terms_df_lst[[txtFeat]][grep("^con", glb_post_stem_words_terms_df_lst[[txtFeat]]$term), ])
#glbObsAll[which(terms_stem_mtrx[, grep("use", dimnames(terms_stem_mtrx)$Terms)[[1]]] > 0), c(glbFeatsId, "productline", txtFeat)]
#glbObsAll[which(TfIdf_stem_mtrx[, 191] > 0), c(glbFeatsId, glbFeatsCategory, txtFeat)]
#glbObsAll[which(glb_post_stop_words_terms_mtrx_lst[[txtFeat]][, 6165] > 0), c(glbFeatsId, glbFeatsCategory, txtFeat)]
#which(glbObsAll$UniqueID %in% c(11915, 11926, 12198))

# Text Processing Step: mycombineSynonyms
#   To identify which terms are associated with not -> combine "could not" & "couldn't"
#findAssocs(glb_full_DTM_lst[[txtFeat]], "not", 0.05)
#   To identify which synonyms should be combined
#orderBy(~term, glb_post_stem_words_terms_df_lst[[txtFeat]][grep("^c", glb_post_stem_words_terms_df_lst[[txtFeat]]$term), ])
chk_comb_cor <- function(syn_lst) {
#     cor(terms_stem_mtrx[glbObsAll$.src == "Train", grep("^(damag|dent|ding)$", dimnames(terms_stem_mtrx)[[2]])], glbObsTrn[, glb_rsp_var], use="pairwise.complete.obs")
    print(subset(glb_post_stem_words_terms_df_lst[[txtFeat]], term %in% syn_lst$syns))
    print(subset(get_corpus_terms(tm_map(glbFeatsTextCorpus[[txtFeat]], mycombineSynonyms, list(syn_lst), lazy=FALSE)), term == syn_lst$word))
#     cor(terms_stop_mtrx[glbObsAll$.src == "Train", grep("^(damage|dent|ding)$", dimnames(terms_stop_mtrx)[[2]])], glbObsTrn[, glb_rsp_var], use="pairwise.complete.obs")
#     cor(rowSums(terms_stop_mtrx[glbObsAll$.src == "Train", grep("^(damage|dent|ding)$", dimnames(terms_stop_mtrx)[[2]])]), glbObsTrn[, glb_rsp_var], use="pairwise.complete.obs")
}
#chk_comb_cor(syn_lst=list(word="cabl",  syns=c("cabl", "cord")))
#chk_comb_cor(syn_lst=list(word="damag",  syns=c("damag", "dent", "ding")))
#chk_comb_cor(syn_lst=list(word="dent",  syns=c("dent", "ding")))
#chk_comb_cor(syn_lst=list(word="use",  syns=c("use", "usag")))

glbFeatsTextSynonyms <- list()
# list parsed to collect glbFeatsText[[<txtFeat>]]$vldTerms
# glbFeatsTextSynonyms[["Hdln.my"]] <- list(NULL
#     # people in places
#     , list(word = "australia", syns = c("australia", "australian"))
#     , list(word = "italy", syns = c("italy", "Italian"))
#     , list(word = "newyork", syns = c("newyork", "newyorker"))    
#     , list(word = "Pakistan", syns = c("Pakistan", "Pakistani"))    
#     , list(word = "peru", syns = c("peru", "peruvian"))
#     , list(word = "qatar", syns = c("qatar", "qatari"))
#     , list(word = "scotland", syns = c("scotland", "scotish"))
#     , list(word = "Shanghai", syns = c("Shanghai", "Shanzhai"))    
#     , list(word = "venezuela", syns = c("venezuela", "venezuelan"))    
# 
#     # companies - needs to be data dependent 
#     #   - e.g. ensure BNP in this experiment/feat always refers to BNPParibas
#         
#     # general synonyms
#     , list(word = "Create", syns = c("Create","Creator")) 
#     , list(word = "cute", syns = c("cute","cutest"))     
#     , list(word = "Disappear", syns = c("Disappear","Fadeout"))     
#     , list(word = "teach", syns = c("teach", "taught"))     
#     , list(word = "theater",  syns = c("theater", "theatre", "theatres")) 
#     , list(word = "understand",  syns = c("understand", "understood"))    
#     , list(word = "weak",  syns = c("weak", "weaken", "weaker", "weakest"))
#     , list(word = "wealth",  syns = c("wealth", "wealthi"))    
#     
#     # custom synonyms (phrases)
#     
#     # custom synonyms (names)
#                                       )
#glbFeatsTextSynonyms[["<txtFeat>"]] <- list(NULL
#     , list(word="<stem1>",  syns=c("<stem1>", "<stem1_2>"))
#                                       )

for (txtFeat in names(glbFeatsTextSynonyms))
    for (entryIx in seq_along(glbFeatsTextSynonyms[[txtFeat]])) { # seq_along avoids the 1:0 bug on empty lists
        glbFeatsTextSynonyms[[txtFeat]][[entryIx]]$word <-
            str_to_lower(glbFeatsTextSynonyms[[txtFeat]][[entryIx]]$word)
        glbFeatsTextSynonyms[[txtFeat]][[entryIx]]$syns <-
            str_to_lower(glbFeatsTextSynonyms[[txtFeat]][[entryIx]]$syns)
    }

glbFeatsTextSeed <- 181
# tm weighting options: see tm::weightSMART
glb_txt_terms_control <- list( # Gather model performance & run-time stats
                    # weighting = function(x) weightSMART(x, spec = "nnn")
                    # weighting = function(x) weightSMART(x, spec = "lnn")
                    # weighting = function(x) weightSMART(x, spec = "ann")
                    # weighting = function(x) weightSMART(x, spec = "bnn")
                    # weighting = function(x) weightSMART(x, spec = "Lnn")
                    # 
                    weighting = function(x) weightSMART(x, spec = "ltn") # default
                    # weighting = function(x) weightSMART(x, spec = "lpn")                    
                    # 
                    # weighting = function(x) weightSMART(x, spec = "ltc")                    
                    # 
                    # weighting = weightBin 
                    # weighting = weightTf 
                    # weighting = weightTfIdf # : default
                # termFreq selection criteria across obs: tm default: list(global=c(1, Inf))
                    , bounds = list(global = c(1, Inf)) 
                # wordLengths selection criteria: tm default: c(3, Inf)
                    , wordLengths = c(1, Inf) 
                              ) 
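The control list above is ultimately handed to tm's `DocumentTermMatrix()`. A minimal, self-contained sketch (assumes the tm package; the three-document toy corpus is illustrative only):

```r
# Sketch: build a DTM with the same control list as glb_txt_terms_control.
# Assumes the tm package; the toy corpus is illustrative only.
library(tm)

toyCorpus <- VCorpus(VectorSource(c("good screen minor scratch",
                                    "screen cracked",
                                    "good condition")))
toyDTM <- DocumentTermMatrix(toyCorpus, control = list(
    weighting   = function(x) weightSMART(x, spec = "ltn"),
    bounds      = list(global = c(1, Inf)),
    wordLengths = c(1, Inf)))
inspect(toyDTM)  # term weights under SMART "ltn" (log tf, idf, no normalization)
```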

glb_txt_cor_var <- glb_rsp_var # : default # or c(<feat>)

# Select one of c("union.top.val.cor", "top.cor", "top.val", "top.chisq", "sparse"); default: "top.chisq"
glbFeatsTextFilter <- "top.chisq" 
glbFeatsTextTermsMax <- rep(10, length(glbFeatsText)) # :default
names(glbFeatsTextTermsMax) <- names(glbFeatsText)

# Text Processing Step: extractAssoc
glbFeatsTextAssocCor <- rep(1, length(glbFeatsText)) # :default 
names(glbFeatsTextAssocCor) <- names(glbFeatsText)

# Remember to use stemmed terms
glb_important_terms <- list()

# Text Processing Step: extractPatterns (ngrams)
glbFeatsTextPatterns <- list()
#glbFeatsTextPatterns[["<txtFeat>"]] <- list()
#glbFeatsTextPatterns[["<txtFeat>"]] <- c(metropolitan.diary.colon = "Metropolitan Diary:")

# Have to set it even if it is not used
# Properties:
#   numrows(glb_feats_df) << numrows(glbObsFit)
#   Select terms that appear in at least 0.2 * O(FP/FN(glbObsOOB)) ???
#       numrows(glbObsOOB) = 1.1 * numrows(glbObsNew) ???
glb_sprs_thresholds <- NULL # or c(<txtFeat1> = 0.988, <txtFeat2> = 0.970, <txtFeat3> = 0.970)

glbFctrMaxUniqVals <- 20 # default: 20
glb_impute_na_data <- FALSE # or TRUE
glb_mice_complete.seed <- 144 # or any integer

glbFeatsCluster <- paste(grep("^Q.", glbFeatsExclude, value = TRUE), "fctr", sep = ".") # default: NULL or c("<feat1>", "<feat2>")
# glbFeatsCluster <- c("YOB.Age.fctr", "Gender.fctr", "Income.fctr",
#                      # "Hhold.fctr",
#                      "Edn.fctr",
#                      paste(grep("^Q.", glbFeatsExclude, value = TRUE), "fctr", sep = "."))
# glbFeatsCluster <- grep(paste0("[", 
#                         toupper(paste0(substr(glbFeatsText, 1, 1), collapse = "")),
#                                       "]\\.[PT]\\."), 
#                                names(glbObsAll), value = TRUE)

glb_cluster.seed <- 189 # or any integer
glbClusterEntropyVar <- NULL # default: NULL; or c(glb_rsp_var, as.factor(cut(glb_rsp_var, 3)))
glbFeatsClusterVarsExclude <- FALSE # default FALSE

glb_interaction_only_feats <- NULL # : default or c(<parent_feat> = "<child_feat>")

glbFeatsNzvFreqMax <- 19 # 19 : caret default
glbFeatsNzvUniqMin <- 10 # 10 : caret default
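These two thresholds correspond to `caret::nearZeroVar()`'s `freqCut` (default 95/5 = 19) and `uniqueCut` (default 10) arguments. A hedged sketch of applying them (assumes caret and the `glbObsTrn` data frame used elsewhere in this script):

```r
# Sketch: flag near-zero-variance features with the thresholds above.
# Assumes caret; glbObsTrn is the training data frame used in this script.
library(caret)
nzvMetrics <- nearZeroVar(glbObsTrn,
                          freqCut     = glbFeatsNzvFreqMax, # 19 = caret's 95/5 default
                          uniqueCut   = glbFeatsNzvUniqMin, # 10 = caret default
                          saveMetrics = TRUE)
head(subset(nzvMetrics, nzv))  # features that would be filtered out
```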

glbRFESizes <- list()
#glbRFESizes[["mdlFamily"]] <- c(4, 8, 16, 32, 64, 67, 68, 69) # Accuracy@69/70 = 0.8258
# glbRFESizes[["RFE.X"]] <- c(96, 112, 120, 124, 128, 129, 130, 131, 132, 133, 135, 138, 142, 157, 187, 247) # accuracy(131) = 0.6285
# glbRFESizes[["Final"]] <- c(8, 16, 32, 40, 44, 46, 48, 49, 50, 51, 52, 56, 64, 96, 128, 247) # accuracy(49) = 0.6164

glbRFEResults <- NULL

glbObsFitOutliers <- list()
# If outliers.n >= 10, consider concatenation of interaction vars
# glbObsFitOutliers[["<mdlFamily>"]] <- c(NULL
#     is.na(.rstudent)
#     max(.rstudent)
#     is.na(.dffits)
#     .hatvalues >= 0.99        
#     -38,167,642 < minmax(.rstudent) < 49,649,823    
#     , <comma-separated-<glbFeatsId>>
#                                     )
glbObsTrnOutliers <- list()
glbObsTrnOutliers[["Final"]] <- union(glbObsFitOutliers[["All.X"]],
                                c(NULL
                                ))

# Modify mdlId to (build & extract) "<FamilyId>#<Fit|Trn>#<caretMethod>#<preProc1.preProc2>#<samplingMethod>"
glb_models_lst <- list(); glb_models_df <- data.frame()

# Add xgboost algorithm

# Regression
if (glb_is_regression) {
    glbMdlMethods <- c(NULL
        # deterministic
            #, "lm", # same as glm
            , "glm", "bayesglm", "glmnet"
            , "rpart"
        # non-deterministic
            , "gbm", "rf" 
        # Unknown
            , "nnet" , "avNNet" # runs 25 models per cv sample for tunelength=5
            , "svmLinear", "svmLinear2"
            , "svmPoly" # runs 75 models per cv sample for tunelength=5
            , "svmRadial" 
            , "earth"
            , "bagEarth" # Takes a long time
            ,"xgbLinear","xgbTree"
        )
} else
# Classification - Add ada (auto feature selection)
    if (glb_is_binomial)
        glbMdlMethods <- c(NULL
        # deterministic                     
            , "bagEarth" # Takes a long time        
            , "glm", "bayesglm", "glmnet"
            , "nnet"
            , "rpart"
        # non-deterministic        
            , "gbm"
            , "avNNet" # runs 25 models per cv sample for tunelength=5      
            , "rf"
        # Unknown
            , "lda", "lda2"
                # svm models crash when predict is called -> internal to kernlab it should call predict without .outcome
            , "svmLinear", "svmLinear2"
            , "svmPoly" # runs 75 models per cv sample for tunelength=5
            , "svmRadial" 
            , "earth"
            ,"xgbLinear","xgbTree"
        ) else
        glbMdlMethods <- c(NULL
        # deterministic
            ,"glmnet"
        # non-deterministic 
            ,"rf"       
        # Unknown
            ,"gbm","rpart","xgbLinear","xgbTree"
        )

glbMdlFamilies <- list(); glb_mdl_feats_lst <- list()
# family: Choose from c("RFE.X", "Csm.X", "All.X", "Best.Interact") %*% c(NULL, ".NOr", ".Inc")
#   RFE = "Recursive Feature Elimination"
#   Csm = CuStoM
#   NOr = No OutlieRs
#   Inc = INteraCt
#   methods: Choose from c(NULL, <method>, glbMdlMethods) 
#glbMdlFamilies[["RFE.X"]] <- c("glmnet", "glm") # non-NULL vector is mandatory
if (glb_is_classification && !glb_is_binomial) {
    # glm does not work for multinomial
    glbMdlFamilies[["All.X"]] <- c("glmnet") 
} else {
    # glbMdlFamilies[["All.X"]] <- c("glmnet", "glm")
    glbMdlFamilies[["All.X"]] <- c("glmnet")    
    # glbMdlFamilies[["RFE.X"]] <- c("glmnet", "glm")
    # glbMdlFamilies[["RFE.X"]] <- setdiff(glbMdlMethods, c(NULL
    #     , "bayesglm" # error: Error in trControl$classProbs && any(classLevels != make.names(classLevels)) : invalid 'x' type in 'x && y'
    #     , "lda","lda2" # error: Error in lda.default(x, grouping, ...) : variable 236 appears to be constant within groups
    #     , "svmLinear" # Error in .local(object, ...) : test vector does not match model ! In addition: Warning messages:
    #     , "svmLinear2" # SVM has not been trained using `probability = TRUE`, probabilities not available for predictions
    #     , "svmPoly" # runs 75 models per cv sample for tunelength=5 # took > 2 hrs # Error in .local(object, ...) : test vector does not match model !     
    #     , "svmRadial" # didn't bother
    #     ,"xgbLinear","xgbTree" # Need clang-omp compiler; Upgrade to Revolution R 3.2.3 (3.2.2 current); https://github.com/dmlc/xgboost/issues/276 thread
    #                                     ))
}
# glbMdlFamilies[["All.X.Inc"]] <- glbMdlFamilies[["All.X"]] # value not used
# glbMdlFamilies[["RFE.X.Inc"]] <- glbMdlFamilies[["RFE.X"]] # value not used

# Check if interaction features make RFE better
# glbMdlFamilies[["CSM.X"]] <- setdiff(glbMdlMethods, c("lda", "lda2")) # crashing due to category:.clusterid ??? #c("glmnet", "glm") # non-NULL list is mandatory
# glb_mdl_feats_lst[["CSM.X"]] <- c(NULL
#     , <comma-separated-features-vector>
#                                   )
# dAFeats.CSM.X %<d-% c(NULL
#     # Interaction feats up to varImp(RFE.X.glmnet) >= 50
#     , <comma-separated-features-vector>
#     , setdiff(myextract_actual_feats(predictors(glbRFEResults)), c(NULL
#                , <comma-separated-features-vector>
#                                                                       ))    
#                                   )
# glb_mdl_feats_lst[["CSM.X"]] <- "%<d-% dAFeats.CSM.X"

# glbMdlFamilies[["Final"]] <- c(NULL) # NULL vector acceptable # c("glmnet", "glm")

glbMdlAllowParallel <- list()
#glbMdlAllowParallel[["Final##rcv#glmnet"]] <- FALSE
glbMdlAllowParallel[["All.X##rcv#glm"]] <- FALSE
glbMdlAllowParallel[["All.X#ica#rcv#glmnet"]] <- FALSE
glbMdlAllowParallel[["All.X#zv.pca#rcv#glmnet"]] <- FALSE
glbMdlAllowParallel[["All.X#zv.pca.spatialSign#rcv#glmnet"]] <- FALSE

glbMdlAllowParallel[["Final.All.X#zv.pca#rcv#glmnet"]] <- FALSE

# Check if tuning parameters make fit better; make it mdlFamily customizable ?
glbMdlTuneParams <- data.frame()
# When glmnet crashes at model$grid with error: ???
AllX__rcv_glmnetTuneParams <- rbind(data.frame()
    ,data.frame(parameter = "alpha",  vals = "0.100 0.325 0.550 0.775 1.000")
    ,data.frame(parameter = "lambda", vals = "0.05 0.06367626 0.07 0.08 0.09167068")
                        )
AllX_zvpca_rcv_glmnetTuneParams <- rbind(data.frame()
    ,data.frame(parameter = "alpha",  vals = "0.100 0.325 0.550 0.775 1.000")
    ,data.frame(parameter = "lambda", vals = "0.0055615497 0.01 0.0258144271 0.03 0.0460673")
                        ) # max.Accuracy.OOB = 0.6020202 @ 0.55 0.03
# AllX_expoTransspatialSign_rcv_glmnetTuneParams <- rbind(data.frame()
#     ,data.frame(parameter = "alpha",  vals = "0.100 0.325 0.550 0.775 1.000")
#     ,data.frame(parameter = "lambda", vals = "0.0072065998 0.02 0.0334500732 0.04 0.05969355")
#                         ) # max.Accuracy.OOB = 0.5956175 @ 0.325 0.03345007
# FinalAllX__rcv_glmnetTuneParams <- rbind(data.frame()
#     ,data.frame(parameter = "alpha",  vals = "0.100 0.325 0.550 0.775 1.000")
#     ,data.frame(parameter = "lambda", vals = "6.451187e-03 0.02 2.994376e-02 0.04 0.05343633")
#                         )
# FinalAllX_expoTransspatialSign_rcv_glmnetTuneParams <- rbind(data.frame()
#     ,data.frame(parameter = "alpha",  vals = "0.100 0.325 0.550 0.775 1.000")
#     ,data.frame(parameter = "lambda", vals = "6.487621e-03 0.02 3.011287e-02 0.04 0.05373812")
#                         ) # max.Accuracy.fit = 0.5991618 @ 0.55 0.03011287
glbMdlTuneParams <- rbind(glbMdlTuneParams
    ,cbind(data.frame(mdlId = "All.X##rcv#glmnet"),            AllX__rcv_glmnetTuneParams)
    ,cbind(data.frame(mdlId = "All.X#zv.pca#rcv#glmnet"),
                                AllX_zvpca_rcv_glmnetTuneParams)
    # ,cbind(data.frame(mdlId = "All.X#expoTrans.spatialSign#rcv#glmnet"),
    #                             AllX_expoTransspatialSign_rcv_glmnetTuneParams)
    # ,cbind(data.frame(mdlId = "Final.All.X##rcv#glmnet"), FinalAllX__rcv_glmnetTuneParams)
    # ,cbind(data.frame(mdlId = "Final.All.X#expoTrans.spatialSign#rcv#glmnet"),
    #                             FinalAllX_expoTransspatialSign_rcv_glmnetTuneParams)
)

    #avNNet    
    #   size=[1] 3 5 7 9; decay=[0] 1e-04 0.001  0.01   0.1; bag=[FALSE]; RMSE=1.3300906 

    #bagEarth
    #   degree=1 [2] 3; nprune=64 128 256 512 [1024]; RMSE=0.6486663 (up)
bagEarthTuneParams <- rbind(data.frame()
                        ,data.frame(parameter = "degree", vals = "1")
                        ,data.frame(parameter = "nprune", vals = "256")
                        )
# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams,
#                                cbind(data.frame(mdlId = "Final.RFE.X.Inc##rcv#bagEarth"),
#                                      bagEarthTuneParams))

# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams, rbind(data.frame()
#     ,data.frame(method = "bagEarth", parameter = "nprune", vals = "256")
#     ,data.frame(method = "bagEarth", parameter = "degree", vals = "2")    
# ))

    #earth 
    #   degree=[1]; nprune=2  [9] 17 25 33; RMSE=0.1334478
    
    #gbm 
    #   shrinkage=0.05 [0.10] 0.15 0.20 0.25; n.trees=100 150 200 [250] 300; interaction.depth=[1] 2 3 4 5; n.minobsinnode=[10]; RMSE=0.2008313     
# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams, rbind(data.frame()
#     ,data.frame(method = "gbm", parameter = "shrinkage", min = 0.05, max = 0.25, by = 0.05)
#     ,data.frame(method = "gbm", parameter = "n.trees", min = 100, max = 300, by = 50)
#     ,data.frame(method = "gbm", parameter = "interaction.depth", min = 1, max = 5, by = 1)
#     ,data.frame(method = "gbm", parameter = "n.minobsinnode", min = 10, max = 10, by = 10)
#     #seq(from=0.05,  to=0.25, by=0.05)
# ))

    #glmnet
    #   alpha=0.100 [0.325] 0.550 0.775 1.000; lambda=0.0005232693 0.0024288010 0.0112734954 [0.0523269304] 0.2428800957; RMSE=0.6164891
# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams, rbind(data.frame()
#     ,data.frame(method = "glmnet", parameter = "alpha", vals = "0.550 0.775 0.8875 0.94375 1.000")
#     ,data.frame(method = "glmnet", parameter = "lambda", vals = "9.858855e-05 0.0001971771 0.0009152152 0.0042480525 0.0197177130")    
# ))

    #nnet    
    #   size=3 5 [7] 9 11; decay=0.0001 0.001 0.01 [0.1] 0.2; RMSE=0.9287422
# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams, rbind(data.frame()
#     ,data.frame(method = "nnet", parameter = "size", vals = "3 5 7 9 11")
#     ,data.frame(method = "nnet", parameter = "decay", vals = "0.0001 0.0010 0.0100 0.1000 0.2000")    
# ))

    #rf # Don't bother; results are not deterministic
    #       mtry=2  35  68 [101] 134; RMSE=0.1339974
# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams, rbind(data.frame()
#     ,data.frame(method = "rf", parameter = "mtry", vals = "2 5 9 13 17")
# ))

    #rpart 
    #   cp=0.020 [0.025] 0.030 0.035 0.040; RMSE=0.1770237
# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams, rbind(data.frame()    
#     ,data.frame(method = "rpart", parameter = "cp", vals = "0.004347826 0.008695652 0.017391304 0.021739130 0.034782609")
# ))
    
    #svmLinear
    #   C=0.01 0.05 [0.10] 0.50 1.00 2.00 3.00 4.00; RMSE=0.1271318; 0.1296718
# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams, rbind(data.frame()
#     ,data.frame(method = "svmLinear", parameter = "C", vals = "0.01 0.05 0.1 0.5 1")
# ))

    #svmLinear2    
    #   cost=0.0625 0.1250 [0.25] 0.50 1.00; RMSE=0.1276354 
# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams, rbind(data.frame()
#     ,data.frame(method = "svmLinear2", parameter = "cost", vals = "0.0625 0.125 0.25 0.5 1")
# ))

    #svmPoly    
    #   degree=[1] 2 3 4 5; scale=0.01 0.05 [0.1] 0.5 1; C=0.50 1.00 [2.00] 3.00 4.00; RMSE=0.1276130
# glbMdlTuneParams <- myrbind_df(glbMdlTuneParams, rbind(data.frame()
#     ,data.frame(method="svmPoly", parameter="degree", min=1, max=5, by=1) #seq(1, 5, 1)
#     ,data.frame(method="svmPoly", parameter="scale", vals="0.01, 0.05, 0.1, 0.5, 1")
#     ,data.frame(method="svmPoly", parameter="C", vals="0.50, 1.00, 2.00, 3.00, 4.00")    
# ))

    #svmRadial
    #   sigma=[0.08674323]; C=0.25 0.50 1.00 [2.00] 4.00; RMSE=0.1614957
    
#glb2Sav(); all.equal(sav_models_df, glb_models_df)

pkgPreprocMethods <-     
# caret version: 6.0.068 # packageVersion("caret")
# operations are applied in this order: zero-variance filter, near-zero variance filter, Box-Cox/Yeo-Johnson/exponential transformation, centering, scaling, range, imputation, PCA, ICA then spatial sign
# *Impute methods needed only if NAs are fed to myfit_mdl
#   Also, ordered.factor in caret creates features as Edn.fctr^4 which is treated as an exponent by bagImpute
    c(NULL
      ,"zv", "nzv"
      ,"BoxCox", "YeoJohnson", "expoTrans"
      ,"center", "scale", "center.scale", "range"
      ,"knnImpute", "bagImpute", "medianImpute"
      ,"zv.pca", "ica", "spatialSign"
      ,"conditionalX") 
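Each dot-separated composite in the list above (e.g. "zv.pca.spatialSign") stands for a vector of `caret::preProcess()` methods, which caret applies internally in the fixed order given in the comment regardless of how they are listed. A hedged sketch (`numericFeats` is a hypothetical name for the numeric feature columns in use):

```r
# Sketch: "zv.pca.spatialSign" expressed as a caret preProcess recipe.
# Assumes caret; numericFeats is hypothetical -- substitute the actual
# numeric feature names.
library(caret)
ppObj <- preProcess(glbObsFit[, numericFeats],
                    method = c("zv", "pca", "spatialSign"))
glbObsNewPP <- predict(ppObj, glbObsNew[, numericFeats])
```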

glbMdlPreprocMethods <- list(# NULL # : default
    "All.X" = list("glmnet" = union(setdiff(pkgPreprocMethods,
                                            c("knnImpute", "bagImpute", "medianImpute")),
                                    # c(NULL)))
                                    c("zv.pca.spatialSign")))
)
# glbMdlPreprocMethods[["RFE.X"]] <- list("glmnet" = union(unlist(glbMdlPreprocMethods[["All.X"]]),
#                                                     "nzv.pca.spatialSign"))

# Baseline prediction model feature(s)
glb_Baseline_mdl_var <- NULL # or c("<feat>")

glbMdlMetric_terms <- NULL # or matrix(c(
#                               0,1,2,3,4,
#                               2,0,1,2,3,
#                               4,2,0,1,2,
#                               6,4,2,0,1,
#                               8,6,4,2,0
#                           ), byrow=TRUE, nrow=5)
glbMdlMetricSummary <- NULL # or "<metric_name>"
glbMdlMetricMaximize <- NULL # or FALSE (TRUE is not the default for both classification & regression) 
glbMdlMetricSummaryFn <- NULL # or function(data, lev=NULL, model=NULL) {
#     confusion_mtrx <- t(as.matrix(confusionMatrix(data$pred, data$obs)))
#     #print(confusion_mtrx)
#     #print(confusion_mtrx * glbMdlMetric_terms)
#     metric <- sum(confusion_mtrx * glbMdlMetric_terms) / nrow(data)
#     names(metric) <- glbMdlMetricSummary
#     return(metric)
# }

glbMdlCheckRcv <- FALSE # Turn it on when needed; otherwise it takes a long time
glb_rcv_n_folds <- 3 # or NULL
glb_rcv_n_repeats <- 3 # or NULL
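`glb_rcv_n_folds` and `glb_rcv_n_repeats` presumably feed caret's repeated cross-validation resampling; a hedged sketch of the corresponding `trainControl()` call:

```r
# Sketch: the repeated-CV control these settings imply.
# Assumes caret; classProbs applies only to classification.
library(caret)
rcvCtrl <- trainControl(method     = "repeatedcv",
                        number     = glb_rcv_n_folds,   # 3 folds
                        repeats    = glb_rcv_n_repeats, # 3 repeats
                        classProbs = glb_is_classification)
```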

glb_clf_proba_threshold <- NULL # 0.5

# Model selection criteria
if (glb_is_regression)
    glbMdlMetricsEval <- c("min.RMSE.OOB", "max.R.sq.OOB", "min.elapsedtime.everything",
                           "max.Adj.R.sq.fit", "min.RMSE.fit")
    #glbMdlMetricsEval <- c("min.RMSE.fit", "max.R.sq.fit", "max.Adj.R.sq.fit")    
if (glb_is_classification) {
    if (glb_is_binomial)
        glbMdlMetricsEval <- 
            c("max.Accuracy.OOB", "max.AUCROCR.OOB", "max.AUCpROC.OOB",
              "min.elapsedtime.everything", 
              # "min.aic.fit", 
              "max.Accuracy.fit") else        
        glbMdlMetricsEval <- c("max.Accuracy.OOB", "max.Kappa.OOB", "min.elapsedtime.everything")
}

# select from NULL [no ensemble models], "auto" [all models better than MFO or Baseline], c(mdl_ids in glb_models_lst) [Typically top-rated models in auto]
glbMdlEnsemble <- NULL #"auto"
#     "%<d-% setdiff(mygetEnsembleAutoMdlIds(), 'CSM.X.rf')" 
#     c(<comma-separated-mdlIds>
#      )

# Only for classifications; for regressions remove "(.*)\\.prob" from the regex
# tmp_fitobs_df <- glbObsFit[, grep(paste0("^", gsub(".", "\\.", mygetPredictIds$value, fixed = TRUE), "CSM\\.X\\.(.*)\\.prob"), names(glbObsFit), value = TRUE)]; cor_mtrx <- cor(tmp_fitobs_df); cor_vctr <- sort(cor_mtrx[row.names(orderBy(~-Overall, varImp(glb_models_lst[["Ensemble.repeatedcv.glmnet"]])$imp))[1], ]); summary(cor_vctr); cor_vctr
#ntv.glm <- glm(reformulate(indepVar, glb_rsp_var), family = "binomial", data = glbObsFit)
#step.glm <- step(ntv.glm)

glbMdlSelId <- NULL #select from c(NULL, "All.X##rcv#glmnet", "RFE.X##rcv#glmnet", <mdlId>)
glbMdlFinId <- NULL #select from c(NULL, glbMdlSelId)

glb_dsp_cols <- c(".pos", glbFeatsId, glbFeatsCategory, glb_rsp_var
#               List critical cols excl. above
                  )

# Output specs
# lclgetfltout_df <- function(obsOutFinDf) {
#     require(tidyr)
#     obsOutFinDf <- obsOutFinDf %>%
#         tidyr::separate("ImageId.x.y", c(".src", ".pos", "x", "y"), 
#                         sep = "#", remove = TRUE, extra = "merge")
#     # mnm prefix stands for max_n_mean
#     mnmout_df <- obsOutFinDf %>%
#         dplyr::group_by(.pos) %>%
#         #dplyr::top_n(1, Probability1) %>% # Score = 3.9426         
#         #dplyr::top_n(2, Probability1) %>% # Score = ???; weighted = 3.94254;         
#         #dplyr::top_n(3, Probability1) %>% # Score = 3.9418; weighted = 3.94169; 
#         dplyr::top_n(4, Probability1) %>% # Score = ???; weighted = 3.94149;        
#         #dplyr::top_n(5, Probability1) %>% # Score = 3.9421; weighted = 3.94178
#     
#         # dplyr::summarize(xMeanN = mean(as.numeric(x)), yMeanN = mean(as.numeric(y)))
#         # dplyr::summarize(xMeanN = weighted.mean(as.numeric(x), Probability1), yMeanN = mean(as.numeric(y)))
#         # dplyr::summarize(xMeanN = weighted.mean(as.numeric(x), c(Probability1, 0.2357323, 0.2336925)), yMeanN = mean(as.numeric(y)))    
#         # dplyr::summarize(xMeanN = weighted.mean(as.numeric(x), c(Probability1)), yMeanN = mean(as.numeric(y)))
#         dplyr::summarize(xMeanN = weighted.mean(as.numeric(x), c(Probability1)), 
#                          yMeanN = weighted.mean(as.numeric(y), c(Probability1)))  
#     
#     maxout_df <- obsOutFinDf %>%
#         dplyr::group_by(.pos) %>%
#         dplyr::summarize(maxProb1 = max(Probability1))
#     fltout_df <- merge(maxout_df, obsOutFinDf, 
#                        by.x = c(".pos", "maxProb1"), by.y = c(".pos", "Probability1"),
#                        all.x = TRUE)
#     fmnout_df <- merge(fltout_df, mnmout_df, 
#                        by.x = c(".pos"), by.y = c(".pos"),
#                        all.x = TRUE)
#     return(fmnout_df)
# }
glbObsOut <- list(NULL
        # glbFeatsId will be the first output column, by default
        ,vars = list()
#         ,mapFn = function(obsOutFinDf) {
#                   }
                  )
#obsOutFinDf <- savobsOutFinDf
# glbObsOut$mapFn <- function(obsOutFinDf) {
#     txfout_df <- dplyr::select(obsOutFinDf, -.pos.y) %>%
#         dplyr::mutate(
#             lunch     = levels(glbObsTrn[, "lunch"    ])[
#                        round(mean(as.numeric(glbObsTrn[, "lunch"    ])), 0)],
#             dinner    = levels(glbObsTrn[, "dinner"   ])[
#                        round(mean(as.numeric(glbObsTrn[, "dinner"   ])), 0)],
#             reserve   = levels(glbObsTrn[, "reserve"  ])[
#                        round(mean(as.numeric(glbObsTrn[, "reserve"  ])), 0)],
#             outdoor   = levels(glbObsTrn[, "outdoor"  ])[
#                        round(mean(as.numeric(glbObsTrn[, "outdoor"  ])), 0)],
#             expensive = levels(glbObsTrn[, "expensive"])[
#                        round(mean(as.numeric(glbObsTrn[, "expensive"])), 0)],
#             liquor    = levels(glbObsTrn[, "liquor"   ])[
#                        round(mean(as.numeric(glbObsTrn[, "liquor"   ])), 0)],
#             table     = levels(glbObsTrn[, "table"    ])[
#                        round(mean(as.numeric(glbObsTrn[, "table"    ])), 0)],
#             classy    = levels(glbObsTrn[, "classy"   ])[
#                        round(mean(as.numeric(glbObsTrn[, "classy"   ])), 0)],
#             kids      = levels(glbObsTrn[, "kids"     ])[
#                        round(mean(as.numeric(glbObsTrn[, "kids"     ])), 0)]
#                       )
#     
#     print("ObsNew output class tables:")
#     print(sapply(c("lunch","dinner","reserve","outdoor",
#                    "expensive","liquor","table",
#                    "classy","kids"), 
#                  function(feat) table(txfout_df[, feat], useNA = "ifany")))
#     
#     txfout_df <- txfout_df %>%
#         dplyr::mutate(labels = "") %>%
#         dplyr::mutate(labels = 
#     ifelse(lunch     != "-1", paste(labels, lunch    ), labels)) %>%
#         dplyr::mutate(labels = 
#     ifelse(dinner    != "-1", paste(labels, dinner   ), labels)) %>%
#         dplyr::mutate(labels = 
#     ifelse(reserve   != "-1", paste(labels, reserve  ), labels)) %>%
#         dplyr::mutate(labels = 
#     ifelse(outdoor   != "-1", paste(labels, outdoor  ), labels)) %>%
#         dplyr::mutate(labels =         
#     ifelse(expensive != "-1", paste(labels, expensive), labels)) %>%
#         dplyr::mutate(labels =         
#     ifelse(liquor    != "-1", paste(labels, liquor   ), labels)) %>%
#         dplyr::mutate(labels =         
#     ifelse(table     != "-1", paste(labels, table    ), labels)) %>%
#         dplyr::mutate(labels =         
#     ifelse(classy    != "-1", paste(labels, classy   ), labels)) %>%
#         dplyr::mutate(labels =         
#     ifelse(kids      != "-1", paste(labels, kids     ), labels)) %>%
#         dplyr::select(business_id, labels)
#     return(txfout_df)
# }
#if (!is.null(glbObsOut$mapFn)) obsOutFinDf <- glbObsOut$mapFn(obsOutFinDf); print(head(obsOutFinDf))

glb_out_obs <- NULL # NULL: default ("new"); otherwise choose from c("all", "new", "trn")

if (glb_is_classification && glb_is_binomial) {
    # glbObsOut$vars[["Probability1"]] <- 
    #     "%<d-% glbObsNew[, mygetPredictIds(glb_rsp_var, glbMdlFinId)$prob]" 
    # glbObsOut$vars[[glb_rsp_var_raw]] <-
    #     "%<d-% glb_map_rsp_var_to_raw(glbObsNew[,
    #                                         mygetPredictIds(glb_rsp_var, glbMdlFinId)$value])"
    glbObsOut$vars[["Predictions"]] <-
        "%<d-% glb_map_rsp_var_to_raw(glbObsNew[,
                                            mygetPredictIds(glb_rsp_var, glbMdlFinId)$value])"
} else {
#     glbObsOut$vars[[glbFeatsId]] <- 
#         "%<d-% as.integer(gsub('Test#', '', glbObsNew[, glbFeatsId]))"
    glbObsOut$vars[[glb_rsp_var]] <- 
        "%<d-% glbObsNew[, mygetPredictIds(glb_rsp_var, glbMdlFinId)$value]"
#     for (outVar in setdiff(glbFeatsExcludeLcl, glb_rsp_var_raw))
#         glbObsOut$vars[[outVar]] <- 
#             paste0("%<d-% mean(glbObsAll[, \"", outVar, "\"], na.rm = TRUE)")
}    
# glbObsOut$vars[[glb_rsp_var_raw]] <- glb_rsp_var_raw
# glbObsOut$vars[[paste0(head(unlist(strsplit(mygetPredictIds$value, "")), -1), collapse = "")]] <-
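The `"%<d-% ..."` strings stored in `glbObsOut$vars` above are delay-evaluated later, so predictions are materialized only when the output file is actually assembled. A minimal sketch of the idiom, assuming the `pryr` package supplies the `%<d-%` delayed-assignment operator:

```r
# Delayed assignment: the right-hand side is captured as a promise and
# evaluated only on first access (assumes the pryr package).
library(pryr)

pred %<d-% {
    cat("computing predictions now\n")  # runs once, at first access
    round(runif(3), 2)
}
length(pred)  # forces evaluation here
```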

glbOutStackFnames <- # NULL #: default
    c("Votes_Ensemble_cnk06_out_fin.csv") # manual stack
    # c("ebayipads_finmdl_bid1_out_nnet_1.csv") # universal stack

glbOut <- list(pfx = "Q109244No_AllXpreProc_cnk04_fit.models_1_")
# lclImageSampleSeed <- 129
glbOutDataVizFname <- NULL # choose from c(NULL, "<projectId>_obsall.csv")


glbChunks <- list(labels = c("set_global_options_wd","set_global_options"
    ,"import.data","inspect.data","scrub.data","transform.data"
    ,"extract.features"
        ,"extract.features.datetime","extract.features.image","extract.features.price"
        ,"extract.features.text","extract.features.string"  
        ,"extract.features.end"
    ,"manage.missing.data","cluster.data","partition.data.training","select.features"
    ,"fit.models_0","fit.models_1","fit.models_2","fit.models_3"
    ,"fit.data.training_0","fit.data.training_1"
    ,"predict.data.new"         
    ,"display.session.info"))
# To ensure that all chunks in this script are in glbChunks
if (!is.null(chkChunksLabels <- knitr::all_labels()) && # knitr::all_labels() doesn't work in console runs
    !identical(chkChunksLabels, glbChunks$labels)) {
    print(sprintf("setdiff(chkChunksLabels, glbChunks$labels): %s", 
                  setdiff(chkChunksLabels, glbChunks$labels)))    
    print(sprintf("setdiff(glbChunks$labels, chkChunksLabels): %s", 
                  setdiff(glbChunks$labels, chkChunksLabels)))    
}

glbChunks[["first"]] <- "scrub.data" # NULL # default: script will load envir from previous chunk
glbChunks[["last" ]] <- "fit.models_1" # default: script will save envir at end of this chunk 
glbChunks[["inpFilePathName"]] <- "data/Q109244No_category_cnk01_inspect.data_inspect.data.RData" # NULL: default or "data/<prvScriptName>_<lstChunkLbl>.RData"
#mysavChunk(glbOut$pfx, glbChunks[["last"]]) # called from myevlChunk
# Temporary: remove the mysavChunk call above once the appropriate .RData file has been saved
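The `first`/`last`/`inpFilePathName` settings above imply a per-chunk checkpoint: save the environment at the end of the "last" chunk, reload it when restarting the script at a later chunk. A sketch under assumptions — the real `mysavChunk` body is not shown here, so these helpers are hypothetical:

```r
# Hypothetical checkpoint helpers mirroring the glbChunks first/last scheme.
mysavChunkSketch <- function(pfx, chunkLbl, envir = globalenv()) {
    # persist every object in the environment under data/<pfx><chunkLbl>.RData
    save(list = ls(envir = envir), envir = envir,
         file = paste0("data/", pfx, chunkLbl, ".RData"))
}
myldChunkSketch <- function(inpFilePathName, envir = globalenv()) {
    load(inpFilePathName, envir = envir)  # restores all saved objects
}
```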

# Inspect max OOB FP
#chkObsOOB <- subset(glbObsOOB, !label.fctr.All.X..rcv.glmnet.is.acc)
#chkObsOOBFP <- subset(chkObsOOB, label.fctr.All.X..rcv.glmnet == "left_eye_center") %>%
#    dplyr::mutate(Probability1 = label.fctr.All.X..rcv.glmnet.prob) %>%
#    select(-.src, -.pos, -x, -y) %>%
#    lclgetfltout_df() %>%
#    mutate(obj.distance = (((as.numeric(x) - left_eye_center_x.int) ^ 2) +
#                           ((as.numeric(y) - left_eye_center_y.int) ^ 2)) ^ 0.5) %>%
#    dplyr::top_n(5, obj.distance) %>%
#    dplyr::top_n(5, -patch.cor)
#
#newImgObs <- glbObsNew[(glbObsNew$ImageId == "Test#0001"), ]; print(newImgObs[which.max(newImgObs$label.fctr.Final..rcv.glmnet.prob), ])
#OOBImgObs <- glbObsOOB[(glbObsOOB$ImageId == "Train#0003"), ]; print(OOBImgObs[which.max(OOBImgObs$label.fctr.All.X..rcv.glmnet.prob), ])

#mygetImage(which(glbObsAll[, glbFeatsId] == "Train#0003"), names(glbFeatsImage)[1], plot = TRUE, featHighlight = c("left_eye_center_x", "left_eye_center_y"), ovrlHighlight = c(66, 35))

# Depict process
glb_analytics_pn <- petrinet(name = "glb_analytics_pn",
                        trans_df = data.frame(id = 1:6,
    name = c("data.training.all","data.new",
           "model.selected","model.final",
           "data.training.all.prediction","data.new.prediction"),
    x=c(   -5,-5,-15,-25,-25,-35),
    y=c(   -5, 5,  0,  0, -5,  5)
                        ),
                        places_df=data.frame(id=1:4,
    name=c("bgn","fit.data.training.all","predict.data.new","end"),
    x=c(   -0,   -20,                    -30,               -40),
    y=c(    0,     0,                      0,                 0),
    M0=c(   3,     0,                      0,                 0)
                        ),
                        arcs_df = data.frame(
    begin = c("bgn","bgn","bgn",        
            "data.training.all","model.selected","fit.data.training.all",
            "fit.data.training.all","model.final",    
            "data.new","predict.data.new",
            "data.training.all.prediction","data.new.prediction"),
    end   = c("data.training.all","data.new","model.selected",
            "fit.data.training.all","fit.data.training.all","model.final",
            "data.training.all.prediction","predict.data.new",
            "predict.data.new","data.new.prediction",
            "end","end")
                        ))
#print(ggplot.petrinet(glb_analytics_pn))
print(ggplot.petrinet(glb_analytics_pn) + coord_flip())
## Loading required package: grid

glb_analytics_avl_objs <- NULL

glb_chunks_df <- myadd_chunk(NULL, 
                             ifelse(is.null(glbChunks$first), "import.data", glbChunks$first))
##        label step_major step_minor label_minor   bgn end elapsed
## 1 scrub.data          1          0           0 6.165  NA      NA
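The timing tables printed throughout this report come from `myadd_chunk`: close out the previous row's `end`/`elapsed`, then append a new row for the chunk that is starting. A simplified sketch (the real helper also maintains `step_major`/`step_minor`/`label_minor`):

```r
# Hypothetical sketch of the chunk-timing bookkeeping behind the tables above.
myadd_chunk_sketch <- function(chunks_df, label) {
    now <- proc.time()[["elapsed"]]
    if (!is.null(chunks_df)) {
        n <- nrow(chunks_df)
        chunks_df[n, "end"]     <- now
        chunks_df[n, "elapsed"] <- now - chunks_df[n, "bgn"]
    }
    rbind(chunks_df,
          data.frame(label = label, bgn = now, end = NA, elapsed = NA))
}
```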

Step 1.0: scrub data

```{r scrub.data, cache=FALSE, echo=FALSE, eval=myevlChunk(glbChunks, glbOut$pfx)}

## [1] "numeric data missing in : "
##        YOB Party.fctr 
##        128        622 
## [1] "numeric data w/ 0s in : "
## YOB.Age.dff 
##         136 
## [1] "numeric data w/ Infs in : "
## named integer(0)
## [1] "numeric data w/ NaNs in : "
## named integer(0)
## [1] "string data missing in : "
##          Gender          Income HouseholdStatus  EducationLevel 
##              46             445             177             410 
##           Party         Q124742         Q124122         Q123464 
##              NA            1438             823             708 
##         Q123621         Q122769         Q122770         Q122771 
##             778             644             594             587 
##         Q122120         Q121699         Q121700         Q120978 
##             585             547             563             599 
##         Q121011         Q120379         Q120650         Q120472 
##             571             607             655             649 
##         Q120194         Q120012         Q120014         Q119334 
##             654             591             641             568 
##         Q119851         Q119650         Q118892         Q118117 
##             540             578             486             479 
##         Q118232         Q118233         Q118237         Q117186 
##             701             554             539             648 
##         Q117193         Q116797         Q116881         Q116953 
##             655             590             635             616 
##         Q116601         Q116441         Q116448         Q116197 
##             534             541             560             551 
##         Q115602         Q115777         Q115610         Q115611 
##             539             578             537             487 
##         Q115899         Q115390         Q114961         Q114748 
##             573             619             538             447 
##         Q115195         Q114517         Q114386         Q113992 
##             525             481             521             447 
##         Q114152         Q113583         Q113584         Q113181 
##             537             514             512             453 
##         Q112478         Q112512         Q112270         Q111848 
##             494             460             521             398 
##         Q111580         Q111220         Q110740         Q109367 
##             474             379             357             168 
##         Q108950         Q109244         Q108855         Q108617 
##             204               0             438             288 
##         Q108856         Q108754         Q108342         Q108343 
##             436             338             341             333 
##         Q107869         Q107491         Q106993         Q106997 
##             389             366             389             396 
##         Q106272         Q106388         Q106389         Q106042 
##             426             476             495             451 
##         Q105840         Q105655         Q104996         Q103293 
##             487             393             400             431 
##         Q102906         Q102674         Q102687         Q102289 
##             493             511             475             484 
##         Q102089         Q101162         Q101163         Q101596 
##             462             498             572             477 
##         Q100689         Q100680         Q100562          Q99982 
##             414             497             487             514 
##         Q100010          Q99716          Q99581          Q99480 
##             445             500             466             478 
##          Q98869          Q98578          Q98059          Q98078 
##             564             542             450             569 
##          Q98197          Q96024 
##             528             550
## Warning in rm(pltObsSmp): object 'pltObsSmp' not found
##            label step_major step_minor label_minor   bgn   end elapsed
## 1     scrub.data          1          0           0 6.165 7.115    0.95
## 2 transform.data          1          1           1 7.116    NA      NA
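The missing-data report above can be reproduced with a simple column audit over the combined observations; a sketch, assuming a data frame like `glbObsAll` with numeric and character/factor columns (the real report also screens for Infs and NaNs):

```r
# Hedged sketch of the scrub.data missing-value audit printed above.
reportMissingSketch <- function(df) {
    isNum <- sapply(df, is.numeric)
    print("numeric data missing in : ")
    print(colSums(is.na(df[, isNum, drop = FALSE])))
    print("numeric data w/ 0s in : ")
    print(colSums(df[, isNum, drop = FALSE] == 0, na.rm = TRUE))
    print("string data missing in : ")
    print(sapply(df[, !isNum, drop = FALSE],
                 function(x) sum(is.na(x) | (x == ""))))
}
```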

Step 1.1: transform data

##              label step_major step_minor label_minor   bgn   end elapsed
## 2   transform.data          1          1           1 7.116 7.151   0.035
## 3 extract.features          2          0           0 7.152    NA      NA

Step 2.0: extract features

##                       label step_major step_minor label_minor   bgn   end
## 3          extract.features          2          0           0 7.152 7.167
## 4 extract.features.datetime          2          1           1 7.168    NA
##   elapsed
## 3   0.015
## 4      NA

Step 2.1: extract features datetime

##                           label step_major step_minor label_minor   bgn
## 1 extract.features.datetime.bgn          1          0           0 7.189
##   end elapsed
## 1  NA      NA
## Warning in rm(pltObsAll): object 'pltObsAll' not found
##                       label step_major step_minor label_minor   bgn   end
## 4 extract.features.datetime          2          1           1 7.168 7.198
## 5    extract.features.image          2          2           2 7.198    NA
##   elapsed
## 4    0.03
## 5      NA

Step 2.2: extract features image

```{r extract.features.image, cache=FALSE, echo=FALSE, fig.height=5, fig.width=5, eval=myevlChunk(glbChunks, glbOut$pfx)}

##                        label step_major step_minor label_minor   bgn end
## 1 extract.features.image.bgn          1          0           0 7.224  NA
##   elapsed
## 1      NA
##                        label step_major step_minor label_minor   bgn   end
## 1 extract.features.image.bgn          1          0           0 7.224 7.231
## 2 extract.features.image.end          2          0           0 7.232    NA
##   elapsed
## 1   0.007
## 2      NA
##                        label step_major step_minor label_minor   bgn   end
## 1 extract.features.image.bgn          1          0           0 7.224 7.231
## 2 extract.features.image.end          2          0           0 7.232    NA
##   elapsed
## 1   0.007
## 2      NA
##                    label step_major step_minor label_minor   bgn   end
## 5 extract.features.image          2          2           2 7.198 7.239
## 6 extract.features.price          2          3           3 7.239    NA
##   elapsed
## 5   0.041
## 6      NA

Step 2.3: extract features price

##                        label step_major step_minor label_minor   bgn end
## 1 extract.features.price.bgn          1          0           0 7.258  NA
##   elapsed
## 1      NA
##                    label step_major step_minor label_minor   bgn   end
## 6 extract.features.price          2          3           3 7.239 7.265
## 7  extract.features.text          2          4           4 7.265    NA
##   elapsed
## 6   0.026
## 7      NA

Step 2.4: extract features text

##                       label step_major step_minor label_minor bgn end
## 1 extract.features.text.bgn          1          0           0 7.3  NA
##   elapsed
## 1      NA
## Warning in rm(tmp_allobs_df): object 'tmp_allobs_df' not found
## Warning in rm(tmp_trnobs_df): object 'tmp_trnobs_df' not found
##                     label step_major step_minor label_minor   bgn  end
## 7   extract.features.text          2          4           4 7.265 7.31
## 8 extract.features.string          2          5           5 7.310   NA
##   elapsed
## 7   0.045
## 8      NA

Step 2.5: extract features string

##                         label step_major step_minor label_minor   bgn end
## 1 extract.features.string.bgn          1          0           0 7.336  NA
##   elapsed
## 1      NA
##                                       label step_major step_minor
## 1               extract.features.string.bgn          1          0
## 2 extract.features.stringfactorize.str.vars          2          0
##   label_minor   bgn   end elapsed
## 1           0 7.336 7.343   0.007
## 2           0 7.343    NA      NA
##            Gender            Income   HouseholdStatus    EducationLevel 
##          "Gender"          "Income" "HouseholdStatus"  "EducationLevel" 
##             Party           Q124742           Q124122           Q123464 
##           "Party"         "Q124742"         "Q124122"         "Q123464" 
##           Q123621           Q122769           Q122770           Q122771 
##         "Q123621"         "Q122769"         "Q122770"         "Q122771" 
##           Q122120           Q121699           Q121700           Q120978 
##         "Q122120"         "Q121699"         "Q121700"         "Q120978" 
##           Q121011           Q120379           Q120650           Q120472 
##         "Q121011"         "Q120379"         "Q120650"         "Q120472" 
##           Q120194           Q120012           Q120014           Q119334 
##         "Q120194"         "Q120012"         "Q120014"         "Q119334" 
##           Q119851           Q119650           Q118892           Q118117 
##         "Q119851"         "Q119650"         "Q118892"         "Q118117" 
##           Q118232           Q118233           Q118237           Q117186 
##         "Q118232"         "Q118233"         "Q118237"         "Q117186" 
##           Q117193           Q116797           Q116881           Q116953 
##         "Q117193"         "Q116797"         "Q116881"         "Q116953" 
##           Q116601           Q116441           Q116448           Q116197 
##         "Q116601"         "Q116441"         "Q116448"         "Q116197" 
##           Q115602           Q115777           Q115610           Q115611 
##         "Q115602"         "Q115777"         "Q115610"         "Q115611" 
##           Q115899           Q115390           Q114961           Q114748 
##         "Q115899"         "Q115390"         "Q114961"         "Q114748" 
##           Q115195           Q114517           Q114386           Q113992 
##         "Q115195"         "Q114517"         "Q114386"         "Q113992" 
##           Q114152           Q113583           Q113584           Q113181 
##         "Q114152"         "Q113583"         "Q113584"         "Q113181" 
##           Q112478           Q112512           Q112270           Q111848 
##         "Q112478"         "Q112512"         "Q112270"         "Q111848" 
##           Q111580           Q111220           Q110740           Q109367 
##         "Q111580"         "Q111220"         "Q110740"         "Q109367" 
##           Q108950           Q109244           Q108855           Q108617 
##         "Q108950"         "Q109244"         "Q108855"         "Q108617" 
##           Q108856           Q108754           Q108342           Q108343 
##         "Q108856"         "Q108754"         "Q108342"         "Q108343" 
##           Q107869           Q107491           Q106993           Q106997 
##         "Q107869"         "Q107491"         "Q106993"         "Q106997" 
##           Q106272           Q106388           Q106389           Q106042 
##         "Q106272"         "Q106388"         "Q106389"         "Q106042" 
##           Q105840           Q105655           Q104996           Q103293 
##         "Q105840"         "Q105655"         "Q104996"         "Q103293" 
##           Q102906           Q102674           Q102687           Q102289 
##         "Q102906"         "Q102674"         "Q102687"         "Q102289" 
##           Q102089           Q101162           Q101163           Q101596 
##         "Q102089"         "Q101162"         "Q101163"         "Q101596" 
##           Q100689           Q100680           Q100562            Q99982 
##         "Q100689"         "Q100680"         "Q100562"          "Q99982" 
##           Q100010            Q99716            Q99581            Q99480 
##         "Q100010"          "Q99716"          "Q99581"          "Q99480" 
##            Q98869            Q98578            Q98059            Q98078 
##          "Q98869"          "Q98578"          "Q98059"          "Q98078" 
##            Q98197            Q96024              .src 
##          "Q98197"          "Q96024"            ".src"
##                     label step_major step_minor label_minor   bgn   end
## 8 extract.features.string          2          5           5 7.310 7.359
## 9    extract.features.end          2          6           6 7.359    NA
##   elapsed
## 8   0.049
## 9      NA

Step 2.6: extract features end

## time trans    "bgn " "fit.data.training.all " "predict.data.new " "end " 
## 0.0000   multiple enabled transitions:  data.training.all data.new model.selected    firing:  data.training.all 
## 1.0000    1   2 1 0 0 
## 1.0000   multiple enabled transitions:  data.training.all data.new model.selected model.final data.training.all.prediction   firing:  data.new 
## 2.0000    2   1 1 1 0

##                   label step_major step_minor label_minor   bgn   end
## 9  extract.features.end          2          6           6 7.359 7.987
## 10  manage.missing.data          3          0           0 7.987    NA
##    elapsed
## 9    0.628
## 10      NA

Step 3.0: manage missing data

## [1] "numeric data missing in : "
##        YOB Party.fctr 
##        128        622 
## [1] "numeric data w/ 0s in : "
## YOB.Age.dff 
##         136 
## [1] "numeric data w/ Infs in : "
## named integer(0)
## [1] "numeric data w/ NaNs in : "
## named integer(0)
## [1] "string data missing in : "
##          Gender          Income HouseholdStatus  EducationLevel 
##              46             445             177             410 
##           Party         Q124742         Q124122         Q123464 
##              NA            1438             823             708 
##         Q123621         Q122769         Q122770         Q122771 
##             778             644             594             587 
##         Q122120         Q121699         Q121700         Q120978 
##             585             547             563             599 
##         Q121011         Q120379         Q120650         Q120472 
##             571             607             655             649 
##         Q120194         Q120012         Q120014         Q119334 
##             654             591             641             568 
##         Q119851         Q119650         Q118892         Q118117 
##             540             578             486             479 
##         Q118232         Q118233         Q118237         Q117186 
##             701             554             539             648 
##         Q117193         Q116797         Q116881         Q116953 
##             655             590             635             616 
##         Q116601         Q116441         Q116448         Q116197 
##             534             541             560             551 
##         Q115602         Q115777         Q115610         Q115611 
##             539             578             537             487 
##         Q115899         Q115390         Q114961         Q114748 
##             573             619             538             447 
##         Q115195         Q114517         Q114386         Q113992 
##             525             481             521             447 
##         Q114152         Q113583         Q113584         Q113181 
##             537             514             512             453 
##         Q112478         Q112512         Q112270         Q111848 
##             494             460             521             398 
##         Q111580         Q111220         Q110740         Q109367 
##             474             379             357             168 
##         Q108950         Q109244         Q108855         Q108617 
##             204               0             438             288 
##         Q108856         Q108754         Q108342         Q108343 
##             436             338             341             333 
##         Q107869         Q107491         Q106993         Q106997 
##             389             366             389             396 
##         Q106272         Q106388         Q106389         Q106042 
##             426             476             495             451 
##         Q105840         Q105655         Q104996         Q103293 
##             487             393             400             431 
##         Q102906         Q102674         Q102687         Q102289 
##             493             511             475             484 
##         Q102089         Q101162         Q101163         Q101596 
##             462             498             572             477 
##         Q100689         Q100680         Q100562          Q99982 
##             414             497             487             514 
##         Q100010          Q99716          Q99581          Q99480 
##             445             500             466             478 
##          Q98869          Q98578          Q98059          Q98078 
##             564             542             450             569 
##          Q98197          Q96024 
##             528             550
## [1] "numeric data missing in : "
##        YOB Party.fctr 
##        128        622 
## [1] "numeric data w/ 0s in : "
## YOB.Age.dff 
##         136 
## [1] "numeric data w/ Infs in : "
## named integer(0)
## [1] "numeric data w/ NaNs in : "
## named integer(0)
## [1] "string data missing in : "
##          Gender          Income HouseholdStatus  EducationLevel 
##              46             445             177             410 
##           Party         Q124742         Q124122         Q123464 
##              NA            1438             823             708 
##         Q123621         Q122769         Q122770         Q122771 
##             778             644             594             587 
##         Q122120         Q121699         Q121700         Q120978 
##             585             547             563             599 
##         Q121011         Q120379         Q120650         Q120472 
##             571             607             655             649 
##         Q120194         Q120012         Q120014         Q119334 
##             654             591             641             568 
##         Q119851         Q119650         Q118892         Q118117 
##             540             578             486             479 
##         Q118232         Q118233         Q118237         Q117186 
##             701             554             539             648 
##         Q117193         Q116797         Q116881         Q116953 
##             655             590             635             616 
##         Q116601         Q116441         Q116448         Q116197 
##             534             541             560             551 
##         Q115602         Q115777         Q115610         Q115611 
##             539             578             537             487 
##         Q115899         Q115390         Q114961         Q114748 
##             573             619             538             447 
##         Q115195         Q114517         Q114386         Q113992 
##             525             481             521             447 
##         Q114152         Q113583         Q113584         Q113181 
##             537             514             512             453 
##         Q112478         Q112512         Q112270         Q111848 
##             494             460             521             398 
##         Q111580         Q111220         Q110740         Q109367 
##             474             379             357             168 
##         Q108950         Q109244         Q108855         Q108617 
##             204               0             438             288 
##         Q108856         Q108754         Q108342         Q108343 
##             436             338             341             333 
##         Q107869         Q107491         Q106993         Q106997 
##             389             366             389             396 
##         Q106272         Q106388         Q106389         Q106042 
##             426             476             495             451 
##         Q105840         Q105655         Q104996         Q103293 
##             487             393             400             431 
##         Q102906         Q102674         Q102687         Q102289 
##             493             511             475             484 
##         Q102089         Q101162         Q101163         Q101596 
##             462             498             572             477 
##         Q100689         Q100680         Q100562          Q99982 
##             414             497             487             514 
##         Q100010          Q99716          Q99581          Q99480 
##             445             500             466             478 
##          Q98869          Q98578          Q98059          Q98078 
##             564             542             450             569 
##          Q98197          Q96024 
##             528             550
##                  label step_major step_minor label_minor   bgn   end
## 10 manage.missing.data          3          0           0 7.987 8.543
## 11        cluster.data          4          0           0 8.543    NA
##    elapsed
## 10   0.556
## 11      NA

Step 4.0: cluster data

```{r cluster.data, cache=FALSE, echo=FALSE, eval=myevlChunk(glbChunks, glbOut$pfx)}

## Loading required package: proxy
## 
## Attaching package: 'proxy'
## The following objects are masked from 'package:stats':
## 
##     as.dist, dist
## The following object is masked from 'package:base':
## 
##     as.matrix
## Loading required package: dynamicTreeCut
## Loading required package: entropy
## Loading required package: tidyr
## Loading required package: ggdendro
## [1] "Clustering features: "
## Warning in cor(data.matrix(glbObsAll[glbObsAll$.src == "Train",
## glbFeatsCluster]), : the standard deviation is zero
##               abs.cor.y
## Q108855.fctr 0.05525571
## Q116881.fctr 0.05625959
## Q98197.fctr  0.07400689
## Q113181.fctr 0.09842608
## Q115611.fctr 0.10612270
## [1] "    .rnorm cor: -0.0049"
## [1] "  Clustering entropy measure: Party.fctr"
## [1] "glbObsAll Entropy: 0.6810"
## Loading required package: lazyeval
##   Q115611.fctr .clusterid Q115611.fctr.clusterid   D   R  .entropy .knt
## 1           NA          1                   NA_1 167 213 0.6858023  380
## 2           No          1                   No_1 579 614 0.6927168 1193
## 3          Yes          1                  Yes_1 292 594 0.6338745  886
## [1] "glbObsAll$Q115611.fctr Entropy: 0.6704 (98.4550 pct)"
## [1] "Category: NA"
## [1] "max distance(0.9770) pair:"
##      USER_ID Party.fctr Q115611.fctr Q124742.fctr Q124122.fctr
## 2961    3680          R           NA           NA           NA
## 3852    4799          R           NA           NA           NA
##      Q123621.fctr Q123464.fctr Q122771.fctr Q122770.fctr Q122769.fctr
## 2961          Yes           No           Pt          Yes           No
## 3852           NA           NA           NA           NA           NA
##      Q122120.fctr Q121700.fctr Q121699.fctr Q121011.fctr Q120978.fctr
## 2961           No           No          Yes          Yes          Yes
## 3852           NA           NA           NA           NA           NA
##      Q120650.fctr Q120472.fctr Q120379.fctr Q120194.fctr Q120014.fctr
## 2961          Yes      Science          Yes           NA          Yes
## 3852           NA           NA           NA           NA           NA
##      Q120012.fctr Q119851.fctr Q119650.fctr Q119334.fctr Q118892.fctr
## 2961          Yes           No       Giving           No          Yes
## 3852           NA           NA           NA          Yes           No
##      Q118237.fctr Q118233.fctr Q118232.fctr Q118117.fctr   Q117193.fctr
## 2961          Yes           No           Pr           No Standard hours
## 3852           No          Yes           NA           No      Odd hours
##      Q117186.fctr Q116797.fctr Q116881.fctr Q116953.fctr Q116601.fctr
## 2961   Hot headed           NA           NA           NA           NA
## 3852  Cool headed           No        Happy           No          Yes
##      Q116441.fctr Q116448.fctr Q116197.fctr Q115602.fctr Q115777.fctr
## 2961           NA           NA           NA           NA        Start
## 3852          Yes           No           NA           NA        Start
##      Q115610.fctr Q115611.fctr.1 Q115899.fctr Q115390.fctr Q115195.fctr
## 2961           NA             NA           NA           NA           NA
## 3852           NA             NA           Me           NA           NA
##      Q114961.fctr Q114748.fctr Q114517.fctr Q114386.fctr Q114152.fctr
## 2961           NA           NA           NA           NA           NA
## 3852           NA          Yes           No   Mysterious           No
##      Q113992.fctr Q113583.fctr Q113584.fctr Q113181.fctr Q112478.fctr
## 2961           NA           NA           NA           NA           NA
## 3852          Yes        Tunes   Technology           NA           No
##      Q112512.fctr Q112270.fctr Q111848.fctr Q111580.fctr Q111220.fctr
## 2961           NA           NA           NA           NA           NA
## 3852          Yes          Yes           No   Supportive           No
##      Q110740.fctr Q109367.fctr Q109244.fctr Q108950.fctr Q108855.fctr
## 2961           NA           No           No     Cautious       Umm...
## 3852           PC          Yes           No           NA           NA
##      Q108617.fctr Q108856.fctr Q108754.fctr Q108342.fctr Q108343.fctr
## 2961           No        Space          Yes       Online           No
## 3852           NA           NA           NA    In-person           No
##      Q107869.fctr Q107491.fctr Q106993.fctr Q106997.fctr Q106272.fctr
## 2961           No          Yes          Yes           Gr           NA
## 3852          Yes          Yes          Yes           Gr          Yes
##      Q106388.fctr Q106389.fctr Q106042.fctr Q105840.fctr Q105655.fctr
## 2961           NA           NA           NA           NA           NA
## 3852          Yes          Yes           NA           NA           NA
##      Q104996.fctr Q103293.fctr Q102906.fctr Q102674.fctr Q102687.fctr
## 2961           NA           NA           NA           NA           NA
## 3852           NA           NA           NA           NA           NA
##      Q102289.fctr Q102089.fctr Q101162.fctr Q101163.fctr Q101596.fctr
## 2961           NA           NA           NA           NA           NA
## 3852          Yes          Own           NA           NA           NA
##      Q100689.fctr Q100680.fctr Q100562.fctr Q100010.fctr Q99982.fctr
## 2961           NA           NA           NA           NA          NA
## 3852          Yes           No           No          Yes      Check!
##      Q99716.fctr Q99581.fctr Q99480.fctr Q98869.fctr Q98578.fctr
## 2961          NA          NA          NA          NA          NA
## 3852         Yes          No         Yes         Yes          No
##      Q98197.fctr Q98059.fctr Q98078.fctr Q96024.fctr
## 2961          NA          NA          NA          NA
## 3852          NA          NA          NA          NA
## [1] "min distance(0.9547) pair:"
##      USER_ID Party.fctr Q115611.fctr Q124742.fctr Q124122.fctr
## 4973    6212          R           NA           NA           NA
## 4976    6217          D           NA           NA           NA
##      Q123621.fctr Q123464.fctr Q122771.fctr Q122770.fctr Q122769.fctr
## 4973           NA           NA           NA           NA           NA
## 4976           NA           NA           NA           NA           NA
##      Q122120.fctr Q121700.fctr Q121699.fctr Q121011.fctr Q120978.fctr
## 4973           NA           NA           NA           NA           NA
## 4976           NA           NA           NA           NA           NA
##      Q120650.fctr Q120472.fctr Q120379.fctr Q120194.fctr Q120014.fctr
## 4973           NA           NA           NA           NA           NA
## 4976           NA           NA           NA           NA           NA
##      Q120012.fctr Q119851.fctr Q119650.fctr Q119334.fctr Q118892.fctr
## 4973           NA           NA    Receiving           NA          Yes
## 4976           NA           NA           NA           NA           NA
##      Q118237.fctr Q118233.fctr Q118232.fctr Q118117.fctr Q117193.fctr
## 4973           NA           NA           Id          Yes           NA
## 4976           NA           NA           NA           NA           NA
##      Q117186.fctr Q116797.fctr Q116881.fctr Q116953.fctr Q116601.fctr
## 4973           NA           NA           NA           NA           NA
## 4976           NA           NA           NA           NA           NA
##      Q116441.fctr Q116448.fctr Q116197.fctr Q115602.fctr Q115777.fctr
## 4973           NA           NA           NA           NA           NA
## 4976           NA           NA           NA           NA           NA
##      Q115610.fctr Q115611.fctr.1 Q115899.fctr Q115390.fctr Q115195.fctr
## 4973           NA             NA           NA           NA           NA
## 4976           NA             NA           NA           NA           NA
##      Q114961.fctr Q114748.fctr Q114517.fctr Q114386.fctr Q114152.fctr
## 4973           NA           NA           NA           NA           NA
## 4976           NA           NA           NA           NA           No
##      Q113992.fctr Q113583.fctr Q113584.fctr Q113181.fctr Q112478.fctr
## 4973           NA           NA           NA          Yes          Yes
## 4976          Yes        Tunes   Technology          Yes           NA
##      Q112512.fctr Q112270.fctr Q111848.fctr Q111580.fctr Q111220.fctr
## 4973          Yes          Yes          Yes           NA           NA
## 4976           NA           NA          Yes    Demanding           No
##      Q110740.fctr Q109367.fctr Q109244.fctr  Q108950.fctr Q108855.fctr
## 4973           NA           NA           No      Cautious       Umm...
## 4976          Mac           No           No Risk-friendly         Yes!
##      Q108617.fctr Q108856.fctr Q108754.fctr Q108342.fctr Q108343.fctr
## 4973           NA    Socialize           No    In-person           NA
## 4976           No    Socialize          Yes    In-person           No
##      Q107869.fctr Q107491.fctr Q106993.fctr Q106997.fctr Q106272.fctr
## 4973           NA          Yes           NA           NA           NA
## 4976           No           NA           NA           NA           NA
##      Q106388.fctr Q106389.fctr Q106042.fctr Q105840.fctr Q105655.fctr
## 4973           NA           NA           NA           NA           NA
## 4976           NA           NA           NA           NA           NA
##      Q104996.fctr Q103293.fctr Q102906.fctr Q102674.fctr Q102687.fctr
## 4973           NA           NA           NA           NA           NA
## 4976           NA           NA           NA           NA           NA
##      Q102289.fctr Q102089.fctr Q101162.fctr Q101163.fctr Q101596.fctr
## 4973           NA           NA           NA           NA           NA
## 4976           NA           NA           NA           NA           NA
##      Q100689.fctr Q100680.fctr Q100562.fctr Q100010.fctr Q99982.fctr
## 4973           NA           NA           NA           NA          NA
## 4976           NA           NA           NA           NA          NA
##      Q99716.fctr Q99581.fctr Q99480.fctr Q98869.fctr Q98578.fctr
## 4973          NA          NA          NA          NA          NA
## 4976          NA          NA          NA          NA          NA
##      Q98197.fctr Q98059.fctr Q98078.fctr Q96024.fctr
## 4973          NA          NA          NA          NA
## 4976          NA          NA          NA          NA
## [1] "Category: No"
## [1] "max distance(0.9733) pair:"
##      USER_ID Party.fctr Q115611.fctr Q124742.fctr Q124122.fctr
## 2007    2489          D           No           NA           NA
## 2084    2593          R           No           No          Yes
##      Q123621.fctr Q123464.fctr Q122771.fctr Q122770.fctr Q122769.fctr
## 2007           NA           No           NA          Yes          Yes
## 2084           No           No           Pt          Yes          Yes
##      Q122120.fctr Q121700.fctr Q121699.fctr Q121011.fctr Q120978.fctr
## 2007          Yes           No          Yes           No           NA
## 2084          Yes           No          Yes          Yes          Yes
##      Q120650.fctr Q120472.fctr Q120379.fctr Q120194.fctr Q120014.fctr
## 2007           NA           NA           NA           NA           NA
## 2084           NA      Science           No    Try first           No
##      Q120012.fctr Q119851.fctr Q119650.fctr Q119334.fctr Q118892.fctr
## 2007           NA           NA           NA           No           No
## 2084          Yes           No       Giving           No           No
##      Q118237.fctr Q118233.fctr Q118232.fctr Q118117.fctr Q117193.fctr
## 2007           NA           NA           NA          Yes           NA
## 2084           NA           NA           NA          Yes           NA
##      Q117186.fctr Q116797.fctr Q116881.fctr Q116953.fctr Q116601.fctr
## 2007           NA           NA           NA           NA          Yes
## 2084           NA           NA           NA           NA           No
##      Q116441.fctr Q116448.fctr Q116197.fctr Q115602.fctr Q115777.fctr
## 2007           No           No         P.M.          Yes        Start
## 2084          Yes          Yes         P.M.           No        Start
##      Q115610.fctr Q115611.fctr.1 Q115899.fctr Q115390.fctr Q115195.fctr
## 2007          Yes             No           NA          Yes          Yes
## 2084          Yes             No           Me           NA           NA
##      Q114961.fctr Q114748.fctr Q114517.fctr Q114386.fctr Q114152.fctr
## 2007           NA          Yes          Yes           NA           No
## 2084           NA           NA           No   Mysterious           No
##      Q113992.fctr Q113583.fctr Q113584.fctr Q113181.fctr Q112478.fctr
## 2007          Yes           NA           NA           NA           NA
## 2084           No        Tunes   Technology           NA           NA
##      Q112512.fctr Q112270.fctr Q111848.fctr Q111580.fctr Q111220.fctr
## 2007           NA          Yes           No   Supportive           No
## 2084           NA           NA           NA           NA           NA
##      Q110740.fctr Q109367.fctr Q109244.fctr Q108950.fctr Q108855.fctr
## 2007           PC          Yes           No           NA         Yes!
## 2084           NA          Yes           No           NA           NA
##      Q108617.fctr Q108856.fctr Q108754.fctr Q108342.fctr Q108343.fctr
## 2007          Yes           NA           No       Online           No
## 2084           NA           NA           NA           NA           NA
##      Q107869.fctr Q107491.fctr Q106993.fctr Q106997.fctr Q106272.fctr
## 2007          Yes          Yes           NA           NA           NA
## 2084           NA          Yes          Yes           Gr          Yes
##      Q106388.fctr Q106389.fctr Q106042.fctr Q105840.fctr Q105655.fctr
## 2007           NA           NA           NA           NA           NA
## 2084           No           No          Yes           NA           No
##      Q104996.fctr Q103293.fctr Q102906.fctr Q102674.fctr Q102687.fctr
## 2007          Yes           NA           No          Yes          Yes
## 2084           No           No           NA          Yes           No
##      Q102289.fctr Q102089.fctr Q101162.fctr Q101163.fctr Q101596.fctr
## 2007          Yes         Rent     Optimist          Mom           No
## 2084           NA           NA           NA           NA           NA
##      Q100689.fctr Q100680.fctr Q100562.fctr Q100010.fctr Q99982.fctr
## 2007          Yes          Yes          Yes          Yes      Check!
## 2084           NA           NA           NA           NA          NA
##      Q99716.fctr Q99581.fctr Q99480.fctr Q98869.fctr Q98578.fctr
## 2007         Yes          No         Yes         Yes          NA
## 2084          NA          NA          NA          NA          NA
##      Q98197.fctr Q98059.fctr Q98078.fctr Q96024.fctr
## 2007          No         Yes          No         Yes
## 2084          NA          NA          NA          NA
## [1] "min distance(0.9574) pair:"
##      USER_ID Party.fctr Q115611.fctr Q124742.fctr Q124122.fctr
## 1423    1771          D           No           NA           NA
## 1923    2380          D           No           NA           NA
##      Q123621.fctr Q123464.fctr Q122771.fctr Q122770.fctr Q122769.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           NA           NA           NA           NA           NA
##      Q122120.fctr Q121700.fctr Q121699.fctr Q121011.fctr Q120978.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           NA           NA           NA           No          Yes
##      Q120650.fctr Q120472.fctr Q120379.fctr Q120194.fctr Q120014.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           NA           NA           NA           NA           NA
##      Q120012.fctr Q119851.fctr Q119650.fctr Q119334.fctr Q118892.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           NA           NA           NA           NA           NA
##      Q118237.fctr Q118233.fctr Q118232.fctr Q118117.fctr Q117193.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           NA           NA           NA           NA           NA
##      Q117186.fctr Q116797.fctr Q116881.fctr Q116953.fctr Q116601.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           NA           NA           NA           NA          Yes
##      Q116441.fctr Q116448.fctr Q116197.fctr Q115602.fctr Q115777.fctr
## 1423           NA           NA           NA          Yes        Start
## 1923          Yes           No         P.M.          Yes          End
##      Q115610.fctr Q115611.fctr.1 Q115899.fctr Q115390.fctr Q115195.fctr
## 1423          Yes             No           Cs          Yes          Yes
## 1923          Yes             No           Cs          Yes          Yes
##      Q114961.fctr Q114748.fctr Q114517.fctr Q114386.fctr Q114152.fctr
## 1423           No          Yes           No          TMI           NA
## 1923          Yes          Yes           No   Mysterious           No
##      Q113992.fctr Q113583.fctr Q113584.fctr Q113181.fctr Q112478.fctr
## 1423          Yes           NA           NA          Yes          Yes
## 1923           No        Tunes   Technology          Yes          Yes
##      Q112512.fctr Q112270.fctr Q111848.fctr Q111580.fctr Q111220.fctr
## 1423           NA           No          Yes   Supportive           NA
## 1923          Yes           No          Yes   Supportive           No
##      Q110740.fctr Q109367.fctr Q109244.fctr Q108950.fctr Q108855.fctr
## 1423           NA          Yes           No           NA           NA
## 1923          Mac          Yes           No     Cautious       Umm...
##      Q108617.fctr Q108856.fctr Q108754.fctr Q108342.fctr Q108343.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           No        Space           No    In-person          Yes
##      Q107869.fctr Q107491.fctr Q106993.fctr Q106997.fctr Q106272.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           No          Yes          Yes           Yy          Yes
##      Q106388.fctr Q106389.fctr Q106042.fctr Q105840.fctr Q105655.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           No           No           NA           NA           NA
##      Q104996.fctr Q103293.fctr Q102906.fctr Q102674.fctr Q102687.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           No           NA           NA           NA           NA
##      Q102289.fctr Q102089.fctr Q101162.fctr Q101163.fctr Q101596.fctr
## 1423           NA           NA           NA           NA           NA
## 1923           NA           NA           NA           NA           NA
##      Q100689.fctr Q100680.fctr Q100562.fctr Q100010.fctr Q99982.fctr
## 1423           NA           NA           NA           NA          NA
## 1923           NA           NA           NA           NA          NA
##      Q99716.fctr Q99581.fctr Q99480.fctr Q98869.fctr Q98578.fctr
## 1423          NA          NA          NA          NA          NA
## 1923          NA          NA          NA          NA          NA
##      Q98197.fctr Q98059.fctr Q98078.fctr Q96024.fctr
## 1423          NA          NA          NA          NA
## 1923          NA          NA          NA         Yes
## [1] "Category: Yes"
## [1] "max distance(0.9685) pair:"
##      USER_ID Party.fctr Q115611.fctr Q124742.fctr Q124122.fctr
## 5702     685       <NA>          Yes           No          Yes
## 6669    5550       <NA>          Yes           No          Yes
##      Q123621.fctr Q123464.fctr Q122771.fctr Q122770.fctr Q122769.fctr
## 5702          Yes           No           Pt          Yes          Yes
## 6669           NA           NA           NA           NA           NA
##      Q122120.fctr Q121700.fctr Q121699.fctr Q121011.fctr Q120978.fctr
## 5702          Yes           No          Yes          Yes          Yes
## 6669           No           No          Yes           No          Yes
##      Q120650.fctr Q120472.fctr Q120379.fctr Q120194.fctr Q120014.fctr
## 5702          Yes           NA           NA           NA          Yes
## 6669           No      Science          Yes    Try first          Yes
##      Q120012.fctr Q119851.fctr Q119650.fctr Q119334.fctr Q118892.fctr
## 5702           No           No       Giving          Yes           No
## 6669          Yes           No       Giving           No          Yes
##      Q118237.fctr Q118233.fctr Q118232.fctr Q118117.fctr   Q117193.fctr
## 5702           No           No           Id           No Standard hours
## 6669           No           No           Id           No Standard hours
##      Q117186.fctr Q116797.fctr Q116881.fctr Q116953.fctr Q116601.fctr
## 5702  Cool headed           No        Right           No          Yes
## 6669  Cool headed          Yes        Happy           No          Yes
##      Q116441.fctr Q116448.fctr Q116197.fctr Q115602.fctr Q115777.fctr
## 5702          Yes           No         P.M.          Yes          End
## 6669           No           No         A.M.          Yes          End
##      Q115610.fctr Q115611.fctr.1 Q115899.fctr Q115390.fctr Q115195.fctr
## 5702          Yes            Yes           Me           No          Yes
## 6669          Yes            Yes           Cs          Yes          Yes
##      Q114961.fctr Q114748.fctr Q114517.fctr Q114386.fctr Q114152.fctr
## 5702           NA           NA           NA           NA           No
## 6669          Yes          Yes           No          TMI          Yes
##      Q113992.fctr Q113583.fctr Q113584.fctr Q113181.fctr Q112478.fctr
## 5702          Yes        Tunes   Technology           NA          Yes
## 6669           No         Talk   Technology           No           No
##      Q112512.fctr Q112270.fctr Q111848.fctr Q111580.fctr Q111220.fctr
## 5702          Yes          Yes           NA           NA           No
## 6669           No          Yes          Yes    Demanding           No
##      Q110740.fctr Q109367.fctr Q109244.fctr  Q108950.fctr Q108855.fctr
## 5702          Mac           No           No      Cautious           NA
## 6669           PC          Yes           No Risk-friendly         Yes!
##      Q108617.fctr Q108856.fctr Q108754.fctr Q108342.fctr Q108343.fctr
## 5702           NA           NA           NA       Online          Yes
## 6669           No        Space           No    In-person          Yes
##      Q107869.fctr Q107491.fctr Q106993.fctr Q106997.fctr Q106272.fctr
## 5702           NA           NA           NA           NA           NA
## 6669          Yes          Yes          Yes           Yy          Yes
##      Q106388.fctr Q106389.fctr Q106042.fctr Q105840.fctr Q105655.fctr
## 5702           NA           NA           NA           NA          Yes
## 6669           No          Yes           No          Yes           No
##      Q104996.fctr Q103293.fctr Q102906.fctr Q102674.fctr Q102687.fctr
## 5702          Yes          Yes          Yes           No           No
## 6669          Yes          Yes           No          Yes          Yes
##      Q102289.fctr Q102089.fctr Q101162.fctr Q101163.fctr Q101596.fctr
## 5702          Yes           NA     Optimist          Mom          Yes
## 6669          Yes          Own     Optimist          Dad           No
##      Q100689.fctr Q100680.fctr Q100562.fctr Q100010.fctr Q99982.fctr
## 5702           No          Yes          Yes          Yes      Check!
## 6669           No          Yes          Yes           No      Check!
##      Q99716.fctr Q99581.fctr Q99480.fctr Q98869.fctr Q98578.fctr
## 5702          NA          No         Yes         Yes          No
## 6669          No          No         Yes          No         Yes
##      Q98197.fctr Q98059.fctr Q98078.fctr Q96024.fctr
## 5702         Yes         Yes          No         Yes
## 6669          NA          NA          NA         Yes
## [1] "min distance(0.9501) pair:"
##      USER_ID Party.fctr Q115611.fctr Q124742.fctr Q124122.fctr
## 4299    5365          R          Yes           NA           NA
## 6802    6188       <NA>          Yes           NA           NA
##      Q123621.fctr Q123464.fctr Q122771.fctr Q122770.fctr Q122769.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q122120.fctr Q121700.fctr Q121699.fctr Q121011.fctr Q120978.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q120650.fctr Q120472.fctr Q120379.fctr Q120194.fctr Q120014.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q120012.fctr Q119851.fctr Q119650.fctr Q119334.fctr Q118892.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q118237.fctr Q118233.fctr Q118232.fctr Q118117.fctr Q117193.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q117186.fctr Q116797.fctr Q116881.fctr Q116953.fctr Q116601.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q116441.fctr Q116448.fctr Q116197.fctr Q115602.fctr Q115777.fctr
## 4299           NA           NA           NA          Yes           NA
## 6802           NA           NA           NA          Yes          End
##      Q115610.fctr Q115611.fctr.1 Q115899.fctr Q115390.fctr Q115195.fctr
## 4299          Yes            Yes           NA          Yes          Yes
## 6802          Yes            Yes           NA          Yes          Yes
##      Q114961.fctr Q114748.fctr Q114517.fctr Q114386.fctr Q114152.fctr
## 4299           No           No          Yes   Mysterious          Yes
## 6802           NA           No           No   Mysterious           No
##      Q113992.fctr Q113583.fctr Q113584.fctr Q113181.fctr Q112478.fctr
## 4299           No         Talk       People          Yes           NA
## 6802          Yes           NA   Technology           No           NA
##      Q112512.fctr Q112270.fctr Q111848.fctr Q111580.fctr Q111220.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q110740.fctr Q109367.fctr Q109244.fctr  Q108950.fctr Q108855.fctr
## 4299           NA          Yes           No            NA           NA
## 6802           NA          Yes           No Risk-friendly       Umm...
##      Q108617.fctr Q108856.fctr Q108754.fctr Q108342.fctr Q108343.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           No        Space          Yes    In-person          Yes
##      Q107869.fctr Q107491.fctr Q106993.fctr Q106997.fctr Q106272.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           No          Yes          Yes           Yy           NA
##      Q106388.fctr Q106389.fctr Q106042.fctr Q105840.fctr Q105655.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q104996.fctr Q103293.fctr Q102906.fctr Q102674.fctr Q102687.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q102289.fctr Q102089.fctr Q101162.fctr Q101163.fctr Q101596.fctr
## 4299           NA           NA           NA           NA           NA
## 6802           NA           NA           NA           NA           NA
##      Q100689.fctr Q100680.fctr Q100562.fctr Q100010.fctr Q99982.fctr
## 4299           NA           NA           NA           NA          NA
## 6802           NA           NA           NA           NA          NA
##      Q99716.fctr Q99581.fctr Q99480.fctr Q98869.fctr Q98578.fctr
## 4299          NA          NA          NA          NA          NA
## 6802          NA          NA          NA          NA          NA
##      Q98197.fctr Q98059.fctr Q98078.fctr Q96024.fctr
## 4299          NA          NA          NA          NA
## 6802          NA          NA          NA          NA
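For each response category, the diagnostic above prints the farthest ("max distance") and nearest ("min distance") observation pairs. A minimal sketch of how such extreme pairs can be located follows; the toy frame is hypothetical and Gower dissimilarity via `cluster::daisy` is an assumption — the report's actual distance metric is not shown in this output:

```r
library(cluster)  # daisy() computes Gower dissimilarity for factor data with NAs

# Hypothetical toy frame standing in for the survey factor columns
obs <- data.frame(a = factor(c("Yes", "No", "Yes", NA)),
                  b = factor(c("No", "No", "Yes", "Yes")))
d <- as.matrix(daisy(obs))
diag(d) <- NA                      # ignore self-distances
maxPair <- which(d == max(d, na.rm = TRUE), arr.ind = TRUE)[1, ]
minPair <- which(d == min(d, na.rm = TRUE), arr.ind = TRUE)[1, ]
```

Gower averages per-variable mismatches over the variables available for a pair, which is why rows with many NAs (as in the pairs printed above) still get a distance.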
##    Q115611.fctr .clusterid Q115611.fctr.clusterid   D   R  .entropy .knt
## 1            NA          1                   NA_1  43  54 0.6867033   97
## 2            NA          2                   NA_2  38  45 0.6895866   83
## 3            NA          3                   NA_3  45  42 0.6925525   87
## 4            NA          4                   NA_4  28  42 0.6730117   70
## 5            NA          5                   NA_5  13  30 0.6128219   43
## 6            No          1                   No_1 390 327 0.6892820  717
## 7            No          2                   No_2  94 179 0.6438609  273
## 8            No          3                   No_3  95 108 0.6910953  203
## 9           Yes          1                  Yes_1 180 316 0.6550703  496
## 10          Yes          2                  Yes_2  82 214 0.5901235  296
## 11          Yes          3                  Yes_3  30  64 0.6262263   94
## [1] "glbObsAll$Q115611.fctr$.clusterid Entropy: 0.6614 (98.6455 pct)"
##                      label step_major step_minor label_minor    bgn    end
## 11            cluster.data          4          0           0  8.543 55.627
## 12 partition.data.training          5          0           0 55.627     NA
##    elapsed
## 11  47.084
## 12      NA
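The `.entropy` column above is consistent with the natural-log Shannon entropy of the D/R split within each cluster, and the overall 0.6614 figure with the size-weighted mean of those per-cluster values. A minimal sketch reproducing both (the counts and entropies are taken from the table above; the pipeline's own helper is not shown):

```r
# Shannon entropy (natural log) of a vector of class counts
clusterEntropy <- function(counts) {
  p <- counts / sum(counts)
  p <- p[p > 0]                      # 0 * log(0) is taken as 0
  -sum(p * log(p))
}
clusterEntropy(c(43, 54))            # cluster NA_1: 43 D vs 54 R -> ~0.6867

# Size-weighted mean over all 11 clusters matches the reported 0.6614
e <- c(0.6867033, 0.6895866, 0.6925525, 0.6730117, 0.6128219, 0.6892820,
       0.6438609, 0.6910953, 0.6550703, 0.5901235, 0.6262263)
n <- c(97, 83, 87, 70, 43, 717, 273, 203, 496, 296, 94)
weighted.mean(e, n)
```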

Step 5.0: partition data training

## [1] "partition.data.training chunk: setup: elapsed: 0.00 secs"
## Loading required package: sqldf
## Loading required package: gsubfn
## Loading required package: proto
## Loading required package: RSQLite
## Loading required package: DBI
## Loading required package: tcltk
## Loading required package: reshape2
## [1] "partition.data.training chunk: strata_mtrx complete: elapsed: 2.50 secs"
## [1] "partition.data.training chunk: obs_freq_df complete: elapsed: 2.50 secs"
## [1] "lclgetMatrixSimilarity: duration: 15.836000 secs"
## Loading required package: sampling
## 
## Attaching package: 'sampling'
## The following object is masked from 'package:caret':
## 
##     cluster
## Stratum 1 
## 
## Population total and number of selected units: 167 36 
## Stratum 2 
## 
## Population total and number of selected units: 579 92 
## Stratum 3 
## 
## Population total and number of selected units: 292 81 
## Stratum 4 
## 
## Population total and number of selected units: 213 49 
## Stratum 5 
## 
## Population total and number of selected units: 614 126 
## Stratum 6 
## 
## Population total and number of selected units: 594 111 
## Number of strata  6 
## Total number of selected units 495 
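The OOB split above is drawn with `sampling::strata` across six strata (495 units selected from 2,459 training observations). The same idea can be sketched in base R by drawing a fixed number of units without replacement within each stratum; the toy data and per-stratum sizes below are illustrative, not the pipeline's actual call:

```r
set.seed(1)
df <- data.frame(stratum = rep(1:3, times = c(100, 200, 300)))
sizes <- c(`1` = 20, `2` = 40, `3` = 60)   # illustrative per-stratum draws

# Simple random sampling without replacement within each stratum
idx <- unlist(lapply(names(sizes), function(s)
  sample(which(df$stratum == s), size = sizes[[s]])))
table(df$stratum[idx])                     # 20, 40, 60 units per stratum
```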
## [1] "lclgetMatrixSimilarity: duration: 11.545000 secs"
## [1] "lclgetMatrixSimilarity: duration: 4.127000 secs"
## [1] "lclgetMatrixSimilarity: duration: 3.826000 secs"
## [1] "lclgetMatrixSimilarity: duration: 9.805000 secs"

## [1] "Similarity of partitions:"
##         cor cosineSmy obs.x obs.y
## 1 0.9999877 0.9506489   OOB   Fit
## 2 0.9999876 0.9522998   OOB   New
## 3 0.9999878 0.9074462   Fit   New
## [1] "partition.data.training chunk: Fit/OOB partition complete: elapsed: 48.88 secs"
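The `cor` and `cosineSmy` columns above measure how alike the OOB, Fit, and New partitions are, presumably over per-partition feature summaries (an assumption — the `lclgetMatrixSimilarity` helper is not shown). Cosine similarity itself is simply:

```r
# Cosine similarity between two numeric vectors
cosineSmy <- function(x, y) sum(x * y) / sqrt(sum(x^2) * sum(y^2))

cosineSmy(c(1, 2, 3), c(2, 4, 6))    # 1: same direction
cosineSmy(c(1, 0), c(0, 1))          # 0: orthogonal
```

Values near 1 for all three pairs, as reported above, indicate the partitions have closely matched profiles.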
##     Party.Democrat Party.Republican Party.NA
##                 NA               NA      622
## Fit            829             1135       NA
## OOB            209              286       NA
##     Party.Democrat Party.Republican Party.NA
##                 NA               NA        1
## Fit      0.4220978        0.5779022       NA
## OOB      0.4222222        0.5777778       NA
##   Q115611.fctr .n.Fit .n.OOB .n.Tst .freqRatio.Fit .freqRatio.OOB
## 2           No    975    218    274      0.4964358      0.4404040
## 3          Yes    694    192    241      0.3533605      0.3878788
## 1           NA    295     85    107      0.1502037      0.1717172
##   .freqRatio.Tst
## 2      0.4405145
## 3      0.3874598
## 1      0.1720257
## [1] "glbObsAll: "
## [1] 3081  222
## [1] "glbObsTrn: "
## [1] 2459  222
## [1] "glbObsFit: "
## [1] 1964  221
## [1] "glbObsOOB: "
## [1] 495 221
## [1] "glbObsNew: "
## [1] 622 221
## [1] "partition.data.training chunk: teardown: elapsed: 49.51 secs"
##                      label step_major step_minor label_minor     bgn
## 12 partition.data.training          5          0           0  55.627
## 13         select.features          6          0           0 105.202
##        end elapsed
## 12 105.201  49.574
## 13      NA      NA

Step 6.0: select features

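The select.features chunk below ranks each candidate predictor by its correlation with the outcome (`cor.y`), keeping the absolute value in `cor.y.abs` for ordering; columns with zero variance trigger the "standard deviation is zero" warning visible in the output. A minimal base-R sketch with hypothetical toy data (the pipeline's actual call differs):

```r
df <- data.frame(outcome = c(1, 0, 1, 1, 0),
                 feat1   = c(5, 2, 4, 5, 1),
                 feat2   = c(7, 7, 7, 7, 7))   # constant -> sd of zero

# cor() warns "the standard deviation is zero" for feat2 and returns NA there
cor.y <- suppressWarnings(
  cor(data.matrix(df[, c("feat1", "feat2")]), y = df$outcome))
cor.y.abs <- abs(cor.y)
```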
```{r select.features, cache=FALSE, echo=FALSE, eval=myevlChunk(glbChunks, glbOut$pfx)}

## Warning in cor(data.matrix(entity_df[, sel_feats]), y =
## as.numeric(entity_df[, : the standard deviation is zero
##                         cor.y exclude.as.feat    cor.y.abs cor.high.X
## Q115611.fctr     0.1061227024               0 0.1061227024         NA
## Q113181.fctr     0.0984260825               0 0.0984260825         NA
## Q98197.fctr      0.0740068938               0 0.0740068938         NA
## .clusterid       0.0617189085               1 0.0617189085         NA
## .clusterid.fctr  0.0617189085               0 0.0617189085         NA
## Q116881.fctr     0.0562595864               0 0.0562595864         NA
## Q108855.fctr     0.0552557062               0 0.0552557062         NA
## Q106272.fctr     0.0508423733               0 0.0508423733         NA
## Q122771.fctr     0.0506936997               0 0.0506936997         NA
## Q123621.fctr     0.0496371444               0 0.0496371444         NA
## Q106388.fctr     0.0494091927               0 0.0494091927         NA
## Q110740.fctr     0.0475251274               0 0.0475251274         NA
## USER_ID          0.0451253514               1 0.0451253514         NA
## .pos             0.0449751220               1 0.0449751220         NA
## Q122769.fctr     0.0357238967               0 0.0357238967         NA
## Q120472.fctr     0.0353553175               0 0.0353553175         NA
## Q101596.fctr     0.0310707949               0 0.0310707949         NA
## Q119334.fctr     0.0307545744               0 0.0307545744         NA
## Q114152.fctr     0.0298031360               0 0.0298031360         NA
## Q98869.fctr      0.0279837842               0 0.0279837842         NA
## Q115899.fctr     0.0275355442               0 0.0275355442         NA
## Q116797.fctr     0.0264023055               0 0.0264023055         NA
## YOB.Age.dff      0.0263896332               0 0.0263896332         NA
## Q118232.fctr     0.0263557469               0 0.0263557469         NA
## Gender.fctr      0.0260785749               0 0.0260785749         NA
## Q105655.fctr     0.0254502518               0 0.0254502518         NA
## Q99480.fctr      0.0241968063               0 0.0241968063         NA
## Q123464.fctr     0.0232487747               0 0.0232487747         NA
## Q120650.fctr     0.0228127973               0 0.0228127973         NA
## Q122120.fctr     0.0226963845               0 0.0226963845         NA
## Q107869.fctr     0.0218682393               0 0.0218682393         NA
## Q120014.fctr     0.0202731591               0 0.0202731591         NA
## Q102289.fctr     0.0192468615               0 0.0192468615         NA
## Income.fctr      0.0179840418               0 0.0179840418         NA
## Q122770.fctr     0.0174735493               0 0.0174735493         NA
## Q111580.fctr     0.0170269417               0 0.0170269417         NA
## Q116601.fctr     0.0159184435               0 0.0159184435         NA
## Q117186.fctr     0.0158641235               0 0.0158641235         NA
## Q106993.fctr     0.0151551332               0 0.0151551332         NA
## Q112270.fctr     0.0147892685               0 0.0147892685         NA
## Q101162.fctr     0.0139326027               0 0.0139326027         NA
## Q108856.fctr     0.0128750759               0 0.0128750759         NA
## Q117193.fctr     0.0114974111               0 0.0114974111         NA
## Q116441.fctr     0.0093463969               0 0.0093463969         NA
## Q119851.fctr     0.0089549525               0 0.0089549525         NA
## Q111848.fctr     0.0085442819               0 0.0085442819         NA
## Q98578.fctr      0.0067135887               0 0.0067135887         NA
## Q118892.fctr     0.0063006467               0 0.0063006467         NA
## Q114386.fctr     0.0057240993               0 0.0057240993         NA
## Q120978.fctr     0.0055115231               0 0.0055115231         NA
## Q112512.fctr     0.0053167658               0 0.0053167658         NA
## Q102674.fctr     0.0050627208               0 0.0050627208         NA
## Q96024.fctr      0.0040534729               0 0.0040534729         NA
## Q108950.fctr     0.0039412433               0 0.0039412433         NA
## Q115610.fctr     0.0037395055               0 0.0037395055         NA
## YOB.Age.fctr     0.0031160191               0 0.0031160191         NA
## Q112478.fctr     0.0028932765               0 0.0028932765         NA
## Q116197.fctr     0.0026906806               0 0.0026906806         NA
## Q124742.fctr     0.0025316261               0 0.0025316261         NA
## Q106389.fctr     0.0020182255               0 0.0020182255         NA
## Edn.fctr         0.0013569584               0 0.0013569584         NA
## Q118117.fctr     0.0006385446               0 0.0006385446         NA
## Q100562.fctr     0.0001743827               0 0.0001743827         NA
## Q107491.fctr    -0.0001103153               0 0.0001103153         NA
## Q116448.fctr    -0.0023584430               0 0.0023584430         NA
## Q108754.fctr    -0.0027742157               0 0.0027742157         NA
## Q116953.fctr    -0.0029373549               0 0.0029373549         NA
## Q115602.fctr    -0.0031238519               0 0.0031238519         NA
## Q118233.fctr    -0.0033273008               0 0.0033273008         NA
## Q120012.fctr    -0.0039513241               0 0.0039513241         NA
## Q118237.fctr    -0.0043335513               0 0.0043335513         NA
## Q99581.fctr     -0.0046486977               0 0.0046486977         NA
## .rnorm          -0.0048723001               0 0.0048723001         NA
## Q120194.fctr    -0.0057263432               0 0.0057263432         NA
## Q115777.fctr    -0.0059804934               0 0.0059804934         NA
## Q106997.fctr    -0.0063914109               0 0.0063914109         NA
## Q100680.fctr    -0.0072431931               0 0.0072431931         NA
## Q113584.fctr    -0.0076436688               0 0.0076436688         NA
## Q108343.fctr    -0.0079333386               0 0.0079333386         NA
## Q121700.fctr    -0.0087942115               0 0.0087942115         NA
## Q105840.fctr    -0.0088034036               0 0.0088034036         NA
## Q120379.fctr    -0.0089842116               0 0.0089842116         NA
## Q103293.fctr    -0.0090167793               0 0.0090167793         NA
## Q124122.fctr    -0.0099503887               0 0.0099503887         NA
## Q109367.fctr    -0.0100116070               0 0.0100116070         NA
## Q113992.fctr    -0.0100378101               0 0.0100378101         NA
## Q121699.fctr    -0.0121369662               0 0.0121369662         NA
## Q121011.fctr    -0.0122186222               0 0.0122186222         NA
## Q114748.fctr    -0.0128363203               0 0.0128363203         NA
## Q106042.fctr    -0.0135167901               0 0.0135167901         NA
## Q111220.fctr    -0.0145971279               0 0.0145971279         NA
## Q114517.fctr    -0.0148538356               0 0.0148538356         NA
## YOB             -0.0169432580               1 0.0169432580         NA
## Q102687.fctr    -0.0169904229               0 0.0169904229         NA
## Q102906.fctr    -0.0173704239               0 0.0173704239         NA
## Q98078.fctr     -0.0177772661               0 0.0177772661         NA
## Q115390.fctr    -0.0196547694               0 0.0196547694         NA
## Q102089.fctr    -0.0200451075               0 0.0200451075         NA
## Q100010.fctr    -0.0208031518               0 0.0208031518         NA
## Q99982.fctr     -0.0208604939               0 0.0208604939         NA
## Q113583.fctr    -0.0211296876               0 0.0211296876         NA
## Q108342.fctr    -0.0211946324               0 0.0211946324         NA
## Q104996.fctr    -0.0218776356               0 0.0218776356         NA
## Q119650.fctr    -0.0222628005               0 0.0222628005         NA
## Q100689.fctr    -0.0263249102               0 0.0263249102         NA
## Q108617.fctr    -0.0285334447               0 0.0285334447         NA
## Q115195.fctr    -0.0295831061               0 0.0295831061         NA
## Q99716.fctr     -0.0333178411               0 0.0333178411         NA
## Q101163.fctr    -0.0349739760               0 0.0349739760         NA
## Q98059.fctr     -0.0354482758               0 0.0354482758         NA
## Q114961.fctr    -0.0396043459               0 0.0396043459         NA
## Hhold.fctr      -0.0644984804               0 0.0644984804         NA
## Q109244.fctr               NA               0           NA         NA
##                 freqRatio percentUnique zeroVar   nzv is.cor.y.abs.low
## Q115611.fctr     1.346501    0.12200081   FALSE FALSE            FALSE
## Q113181.fctr     1.207806    0.12200081   FALSE FALSE            FALSE
## Q98197.fctr      1.293258    0.12200081   FALSE FALSE            FALSE
## .clusterid       2.009202    0.20333469   FALSE FALSE            FALSE
## .clusterid.fctr  2.009202    0.20333469   FALSE FALSE            FALSE
## Q116881.fctr     2.244592    0.12200081   FALSE FALSE            FALSE
## Q108855.fctr     1.511876    0.12200081   FALSE FALSE            FALSE
## Q106272.fctr     2.712785    0.12200081   FALSE FALSE            FALSE
## Q122771.fctr     3.656388    0.12200081   FALSE FALSE            FALSE
## Q123621.fctr     1.144186    0.12200081   FALSE FALSE            FALSE
## Q106388.fctr     2.478261    0.12200081   FALSE FALSE            FALSE
## Q110740.fctr     1.491409    0.12200081   FALSE FALSE            FALSE
## USER_ID          1.000000  100.00000000   FALSE FALSE            FALSE
## .pos             1.000000  100.00000000   FALSE FALSE            FALSE
## Q122769.fctr     1.693260    0.12200081   FALSE FALSE            FALSE
## Q120472.fctr     2.803119    0.12200081   FALSE FALSE            FALSE
## Q101596.fctr     1.737877    0.12200081   FALSE FALSE            FALSE
## Q119334.fctr     1.125395    0.12200081   FALSE FALSE            FALSE
## Q114152.fctr     2.270531    0.12200081   FALSE FALSE            FALSE
## Q98869.fctr      3.614191    0.12200081   FALSE FALSE            FALSE
## Q115899.fctr     1.448234    0.12200081   FALSE FALSE            FALSE
## Q116797.fctr     2.093023    0.12200081   FALSE FALSE            FALSE
## YOB.Age.dff      1.021687    0.73200488   FALSE FALSE            FALSE
## Q118232.fctr     1.307600    0.12200081   FALSE FALSE            FALSE
## Gender.fctr      2.405063    0.12200081   FALSE FALSE            FALSE
## Q105655.fctr     1.274947    0.12200081   FALSE FALSE            FALSE
## Q99480.fctr      4.095588    0.12200081   FALSE FALSE            FALSE
## Q123464.fctr     3.190731    0.12200081   FALSE FALSE            FALSE
## Q120650.fctr     3.544379    0.12200081   FALSE FALSE            FALSE
## Q122120.fctr     3.098563    0.12200081   FALSE FALSE            FALSE
## Q107869.fctr     1.293617    0.12200081   FALSE FALSE            FALSE
## Q120014.fctr     1.623324    0.12200081   FALSE FALSE            FALSE
## Q102289.fctr     2.286846    0.12200081   FALSE FALSE            FALSE
## Income.fctr      1.026596    0.28466856   FALSE FALSE            FALSE
## Q122770.fctr     1.407720    0.12200081   FALSE FALSE            FALSE
## Q111580.fctr     1.934936    0.12200081   FALSE FALSE            FALSE
## Q116601.fctr     3.948235    0.12200081   FALSE FALSE            FALSE
## Q117186.fctr     1.761702    0.12200081   FALSE FALSE            FALSE
## Q106993.fctr     4.903581    0.12200081   FALSE FALSE            FALSE
## Q112270.fctr     1.134313    0.12200081   FALSE FALSE            FALSE
## Q101162.fctr     1.556382    0.12200081   FALSE FALSE            FALSE
## Q108856.fctr     2.251156    0.12200081   FALSE FALSE            FALSE
## Q117193.fctr     1.346618    0.12200081   FALSE FALSE            FALSE
## Q116441.fctr     1.638601    0.12200081   FALSE FALSE            FALSE
## Q119851.fctr     1.575758    0.12200081   FALSE FALSE            FALSE
## Q111848.fctr     1.402247    0.12200081   FALSE FALSE            FALSE
## Q98578.fctr      1.717158    0.12200081   FALSE FALSE            FALSE
## Q118892.fctr     1.411356    0.12200081   FALSE FALSE            FALSE
## Q114386.fctr     1.426366    0.12200081   FALSE FALSE            FALSE
## Q120978.fctr     1.264000    0.12200081   FALSE FALSE            FALSE
## Q112512.fctr     4.175743    0.12200081   FALSE FALSE            FALSE
## Q102674.fctr     1.849030    0.12200081   FALSE FALSE            FALSE
## Q96024.fctr      1.683444    0.12200081   FALSE FALSE             TRUE
## Q108950.fctr     2.087366    0.12200081   FALSE FALSE             TRUE
## Q115610.fctr     3.966825    0.12200081   FALSE FALSE             TRUE
## YOB.Age.fctr     1.180000    0.36600244   FALSE FALSE             TRUE
## Q112478.fctr     1.459880    0.12200081   FALSE FALSE             TRUE
## Q116197.fctr     2.031484    0.12200081   FALSE FALSE             TRUE
## Q124742.fctr     1.322468    0.12200081   FALSE FALSE             TRUE
## Q106389.fctr     1.086957    0.12200081   FALSE FALSE             TRUE
## Edn.fctr         1.694524    0.32533550   FALSE FALSE             TRUE
## Q118117.fctr     1.402074    0.12200081   FALSE FALSE             TRUE
## Q100562.fctr     4.329082    0.12200081   FALSE FALSE             TRUE
## Q107491.fctr     6.288591    0.12200081   FALSE FALSE             TRUE
## Q116448.fctr     1.352326    0.12200081   FALSE FALSE             TRUE
## Q108754.fctr     2.045897    0.12200081   FALSE FALSE             TRUE
## Q116953.fctr     1.978788    0.12200081   FALSE FALSE             TRUE
## Q115602.fctr     3.864608    0.12200081   FALSE FALSE             TRUE
## Q118233.fctr     2.654676    0.12200081   FALSE FALSE             TRUE
## Q120012.fctr     1.344262    0.12200081   FALSE FALSE             TRUE
## Q118237.fctr     1.429419    0.12200081   FALSE FALSE             TRUE
## Q99581.fctr      4.989011    0.12200081   FALSE FALSE             TRUE
## .rnorm           1.000000  100.00000000   FALSE FALSE            FALSE
## Q120194.fctr     1.317536    0.12200081   FALSE FALSE            FALSE
## Q115777.fctr     1.387173    0.12200081   FALSE FALSE            FALSE
## Q106997.fctr     1.187117    0.12200081   FALSE FALSE            FALSE
## Q100680.fctr     1.250542    0.12200081   FALSE FALSE            FALSE
## Q113584.fctr     1.008824    0.12200081   FALSE FALSE            FALSE
## Q108343.fctr     1.565774    0.12200081   FALSE FALSE            FALSE
## Q121700.fctr     3.968468    0.12200081   FALSE FALSE            FALSE
## Q105840.fctr     1.486260    0.12200081   FALSE FALSE            FALSE
## Q120379.fctr     1.447596    0.12200081   FALSE FALSE            FALSE
## Q103293.fctr     1.252910    0.12200081   FALSE FALSE            FALSE
## Q124122.fctr     1.187500    0.12200081   FALSE FALSE            FALSE
## Q109367.fctr     1.460805    0.12200081   FALSE FALSE            FALSE
## Q113992.fctr     2.253086    0.12200081   FALSE FALSE            FALSE
## Q121699.fctr     2.600355    0.12200081   FALSE FALSE            FALSE
## Q121011.fctr     1.158969    0.12200081   FALSE FALSE            FALSE
## Q114748.fctr     1.344482    0.12200081   FALSE FALSE            FALSE
## Q106042.fctr     1.319690    0.12200081   FALSE FALSE            FALSE
## Q111220.fctr     3.014898    0.12200081   FALSE FALSE            FALSE
## Q114517.fctr     2.182796    0.12200081   FALSE FALSE            FALSE
## YOB              1.105769    2.92801952   FALSE FALSE            FALSE
## Q102687.fctr     1.004803    0.12200081   FALSE FALSE            FALSE
## Q102906.fctr     1.950213    0.12200081   FALSE FALSE            FALSE
## Q98078.fctr      1.590206    0.12200081   FALSE FALSE            FALSE
## Q115390.fctr     1.460576    0.12200081   FALSE FALSE            FALSE
## Q102089.fctr     2.381260    0.12200081   FALSE FALSE            FALSE
## Q100010.fctr     3.962529    0.12200081   FALSE FALSE            FALSE
## Q99982.fctr      1.107143    0.12200081   FALSE FALSE            FALSE
## Q113583.fctr     1.881690    0.12200081   FALSE FALSE            FALSE
## Q108342.fctr     2.426563    0.12200081   FALSE FALSE            FALSE
## Q104996.fctr     1.032350    0.12200081   FALSE FALSE            FALSE
## Q119650.fctr     3.108830    0.12200081   FALSE FALSE            FALSE
## Q100689.fctr     1.440137    0.12200081   FALSE FALSE            FALSE
## Q108617.fctr     7.888446    0.12200081   FALSE FALSE            FALSE
## Q115195.fctr     1.713147    0.12200081   FALSE FALSE            FALSE
## Q99716.fctr      4.852332    0.12200081   FALSE FALSE            FALSE
## Q101163.fctr     1.437576    0.12200081   FALSE FALSE            FALSE
## Q98059.fctr      5.379888    0.12200081   FALSE FALSE            FALSE
## Q114961.fctr     1.029000    0.12200081   FALSE FALSE            FALSE
## Hhold.fctr       1.209330    0.28466856   FALSE FALSE            FALSE
## Q109244.fctr     0.000000    0.04066694    TRUE  TRUE               NA
## Warning in myplot_scatter(plt_feats_df, "percentUnique", "freqRatio",
## colorcol_name = "nzv", : converting nzv to class:factor
## Warning: Removed 3 rows containing missing values (geom_point).

## Warning: Removed 3 rows containing missing values (geom_point).

## Warning: Removed 3 rows containing missing values (geom_point).

##              cor.y exclude.as.feat cor.y.abs cor.high.X freqRatio
## Q109244.fctr    NA               0        NA         NA         0
##              percentUnique zeroVar  nzv is.cor.y.abs.low
## Q109244.fctr    0.04066694    TRUE TRUE               NA
## Scale for 'y' is already present. Adding another scale for 'y', which
## will replace the existing scale.

## [1] "numeric data missing in : "
##        YOB Party.fctr 
##        128        622 
## [1] "numeric data w/ 0s in : "
## YOB.Age.dff 
##         136 
## [1] "numeric data w/ Infs in : "
## named integer(0)
## [1] "numeric data w/ NaNs in : "
## named integer(0)
## [1] "string data missing in : "
##          Gender          Income HouseholdStatus  EducationLevel 
##              46             445             177             410 
##           Party         Q124742         Q124122         Q123464 
##              NA            1438             823             708 
##         Q123621         Q122769         Q122770         Q122771 
##             778             644             594             587 
##         Q122120         Q121699         Q121700         Q120978 
##             585             547             563             599 
##         Q121011         Q120379         Q120650         Q120472 
##             571             607             655             649 
##         Q120194         Q120012         Q120014         Q119334 
##             654             591             641             568 
##         Q119851         Q119650         Q118892         Q118117 
##             540             578             486             479 
##         Q118232         Q118233         Q118237         Q117186 
##             701             554             539             648 
##         Q117193         Q116797         Q116881         Q116953 
##             655             590             635             616 
##         Q116601         Q116441         Q116448         Q116197 
##             534             541             560             551 
##         Q115602         Q115777         Q115610         Q115611 
##             539             578             537             487 
##         Q115899         Q115390         Q114961         Q114748 
##             573             619             538             447 
##         Q115195         Q114517         Q114386         Q113992 
##             525             481             521             447 
##         Q114152         Q113583         Q113584         Q113181 
##             537             514             512             453 
##         Q112478         Q112512         Q112270         Q111848 
##             494             460             521             398 
##         Q111580         Q111220         Q110740         Q109367 
##             474             379             357             168 
##         Q108950         Q109244         Q108855         Q108617 
##             204               0             438             288 
##         Q108856         Q108754         Q108342         Q108343 
##             436             338             341             333 
##         Q107869         Q107491         Q106993         Q106997 
##             389             366             389             396 
##         Q106272         Q106388         Q106389         Q106042 
##             426             476             495             451 
##         Q105840         Q105655         Q104996         Q103293 
##             487             393             400             431 
##         Q102906         Q102674         Q102687         Q102289 
##             493             511             475             484 
##         Q102089         Q101162         Q101163         Q101596 
##             462             498             572             477 
##         Q100689         Q100680         Q100562          Q99982 
##             414             497             487             514 
##         Q100010          Q99716          Q99581          Q99480 
##             445             500             466             478 
##          Q98869          Q98578          Q98059          Q98078 
##             564             542             450             569 
##          Q98197          Q96024            .lcn 
##             528             550             622
## [1] "glb_feats_df:"
## [1] 113  12
##                    id exclude.as.feat rsp_var
## Party.fctr Party.fctr            TRUE    TRUE
##                    id      cor.y exclude.as.feat  cor.y.abs cor.high.X
## USER_ID       USER_ID 0.04512535            TRUE 0.04512535         NA
## Party.fctr Party.fctr         NA            TRUE         NA         NA
##            freqRatio percentUnique zeroVar   nzv is.cor.y.abs.low
## USER_ID            1           100   FALSE FALSE            FALSE
## Party.fctr        NA            NA      NA    NA               NA
##            interaction.feat shapiro.test.p.value rsp_var_raw id_var
## USER_ID                <NA>                   NA       FALSE   TRUE
## Party.fctr             <NA>                   NA          NA     NA
##            rsp_var
## USER_ID         NA
## Party.fctr    TRUE
## [1] "glb_feats_df vs. glbObsAll: "
## character(0)
## [1] "glbObsAll vs. glb_feats_df: "
## character(0)
##              label step_major step_minor label_minor     bgn     end
## 13 select.features          6          0           0 105.202 108.106
## 14      fit.models          7          0           0 108.107      NA
##    elapsed
## 13   2.904
## 14      NA

Step 7.0: fit models

fit.models_0_chunk_df <- myadd_chunk(NULL, "fit.models_0_bgn", label.minor = "setup")
##              label step_major step_minor label_minor     bgn end elapsed
## 1 fit.models_0_bgn          1          0       setup 108.647  NA      NA
# load(paste0(glbOut$pfx, "dsk.RData"))

glbgetModelSelectFormula <- function() {
    model_evl_terms <- c(NULL)
    # min.aic.fit might not be available
    lclMdlEvlCriteria <- 
        glbMdlMetricsEval[glbMdlMetricsEval %in% names(glb_models_df)]
    for (metric in lclMdlEvlCriteria)
        model_evl_terms <- c(model_evl_terms, 
                             ifelse(length(grep("max", metric)) > 0, "-", "+"), metric)
    if (glb_is_classification && glb_is_binomial)
        model_evl_terms <- c(model_evl_terms, "-", "opt.prob.threshold.OOB")
    model_sel_frmla <- as.formula(paste(c("~ ", model_evl_terms), collapse = " "))
    return(model_sel_frmla)
}
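
The helper above builds the kind of sorting formula that `doBy::orderBy` consumes; metrics to be maximized get a `-` prefix so they sort descending. A minimal standalone sketch, with hypothetical metric names:

```r
# Toy illustration (hypothetical metric names, base R only): build the same
# kind of model-selection sorting formula as glbgetModelSelectFormula().
metrics <- c("max.Accuracy.OOB", "min.aic.fit")
terms <- c(NULL)
for (metric in metrics)
    terms <- c(terms, ifelse(length(grep("max", metric)) > 0, "-", "+"), metric)
frmla <- as.formula(paste(c("~ ", terms), collapse = " "))
# frmla is ~ -max.Accuracy.OOB + min.aic.fit:
# sort Accuracy descending, then AIC ascending
length(all.vars(frmla))  # 2
```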

glbgetDisplayModelsDf <- function() {
    dsp_models_cols <- c("id", 
                    glbMdlMetricsEval[glbMdlMetricsEval %in% names(glb_models_df)],
                    grep("opt.", names(glb_models_df), fixed = TRUE, value = TRUE)) 
    dsp_models_df <- 
        #orderBy(glbgetModelSelectFormula(), glb_models_df)[, c("id", glbMdlMetricsEval)]
        orderBy(glbgetModelSelectFormula(), glb_models_df)[, dsp_models_cols]    
    nCvMdl <- sapply(glb_models_lst, function(mdl) nrow(mdl$results))
    nParams <- sapply(glb_models_lst, function(mdl) ifelse(mdl$method == "custom", 0, 
        nrow(subset(modelLookup(mdl$method), parameter != "parameter"))))
    
#     nCvMdl <- nCvMdl[names(nCvMdl) != "avNNet"]
#     nParams <- nParams[names(nParams) != "avNNet"]    
    
    if (length(cvMdlProblems <- nCvMdl[nCvMdl <= nParams]) > 0) {
        print("Cross Validation issues:")
        warning("Cross Validation issues:")        
        print(cvMdlProblems)
    }
    
    pltMdls <- setdiff(names(nCvMdl), names(cvMdlProblems))
    pltMdls <- setdiff(pltMdls, names(nParams[nParams == 0]))
    
    # length(pltMdls) == 21
    png(paste0(glbOut$pfx, "bestTune.png"), width = 480 * 2, height = 480 * 4)
    grid.newpage()
    pushViewport(viewport(layout = grid.layout(ceiling(length(pltMdls) / 2.0), 2)))
    pltIx <- 1
    for (mdlId in pltMdls) {
        print(ggplot(glb_models_lst[[mdlId]], highBestTune = TRUE) + labs(title = mdlId),   
              vp = viewport(layout.pos.row = ceiling(pltIx / 2.0), 
                            layout.pos.col = ((pltIx - 1) %% 2) + 1))  
        pltIx <- pltIx + 1
    }
    dev.off()

    if (all(row.names(dsp_models_df) != dsp_models_df$id))
        row.names(dsp_models_df) <- dsp_models_df$id
    return(dsp_models_df)
}
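
The cross-validation sanity check inside `glbgetDisplayModelsDf` can be sketched with made-up counts: a model whose `results` table has no more rows than its method has tuning parameters never actually searched a resampling grid.

```r
# Hypothetical counts (base R only): rows in each caret model's results
# table vs. the number of tuning parameters of its train method.
nCvMdl  <- c(rpart = 11, rf = 1, glm = 1)
nParams <- c(rpart = 1,  rf = 1, glm = 0)
cvMdlProblems <- nCvMdl[nCvMdl <= nParams]
names(cvMdlProblems)  # "rf": only one results row for one tuning parameter
```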
#glbgetDisplayModelsDf()

glb_get_predictions <- function(df, mdl_id, rsp_var, prob_threshold_def=NULL, verbose=FALSE) {
    mdl <- glb_models_lst[[mdl_id]]
    
    clmnNames <- mygetPredictIds(rsp_var, mdl_id)
    predct_var_name <- clmnNames$value        
    predct_prob_var_name <- clmnNames$prob
    predct_accurate_var_name <- clmnNames$is.acc
    predct_error_var_name <- clmnNames$err
    predct_erabs_var_name <- clmnNames$err.abs

    if (glb_is_regression) {
        df[, predct_var_name] <- predict(mdl, newdata=df, type="raw")
        if (verbose) print(myplot_scatter(df, glb_rsp_var, predct_var_name) + 
                  facet_wrap(reformulate(glbFeatsCategory), scales = "free") + 
                  stat_smooth(method="glm"))

        df[, predct_error_var_name] <- df[, predct_var_name] - df[, glb_rsp_var]
        if (verbose) print(myplot_scatter(df, predct_var_name, predct_error_var_name) + 
                  #facet_wrap(reformulate(glbFeatsCategory), scales = "free") + 
                  stat_smooth(method="auto"))
        if (verbose) print(myplot_scatter(df, glb_rsp_var, predct_error_var_name) + 
                  #facet_wrap(reformulate(glbFeatsCategory), scales = "free") + 
                  stat_smooth(method="glm"))
        
        df[, predct_erabs_var_name] <- abs(df[, predct_error_var_name])
        if (verbose) print(head(orderBy(reformulate(c("-", predct_erabs_var_name)), df)))
        
        df[, predct_accurate_var_name] <- (df[, glb_rsp_var] == df[, predct_var_name])
    }

    if (glb_is_classification && glb_is_binomial) {
        prob_threshold <- glb_models_df[glb_models_df$id == mdl_id, 
                                        "opt.prob.threshold.OOB"]
        if (is.null(prob_threshold) || is.na(prob_threshold)) {
            warning("Using default probability threshold: ", prob_threshold_def)
            if (is.null(prob_threshold <- prob_threshold_def))
                stop("Default probability threshold is NULL")
        }
        
        df[, predct_prob_var_name] <- predict(mdl, newdata = df, type = "prob")[, 2]
        df[, predct_var_name] <- 
                factor(levels(df[, glb_rsp_var])[
                    (df[, predct_prob_var_name] >=
                        prob_threshold) * 1 + 1], levels(df[, glb_rsp_var]))
    
#         if (verbose) print(myplot_scatter(df, glb_rsp_var, predct_var_name) + 
#                   facet_wrap(reformulate(glbFeatsCategory), scales = "free") + 
#                   stat_smooth(method="glm"))

        df[, predct_error_var_name] <- df[, predct_var_name] != df[, glb_rsp_var]
#         if (verbose) print(myplot_scatter(df, predct_var_name, predct_error_var_name) + 
#                   #facet_wrap(reformulate(glbFeatsCategory), scales = "free") + 
#                   stat_smooth(method="auto"))
#         if (verbose) print(myplot_scatter(df, glb_rsp_var, predct_error_var_name) + 
#                   #facet_wrap(reformulate(glbFeatsCategory), scales = "free") + 
#                   stat_smooth(method="glm"))
        
        # if prediction is a TP (true +ve), measure distance from 1.0
        tp <- which((df[, predct_var_name] == df[, glb_rsp_var]) &
                    (df[, predct_var_name] == levels(df[, glb_rsp_var])[2]))
        df[tp, predct_erabs_var_name] <- abs(1 - df[tp, predct_prob_var_name])
        #rowIx <- which.max(df[tp, predct_erabs_var_name]); df[tp, c(glbFeatsId, glb_rsp_var, predct_var_name, predct_prob_var_name, predct_erabs_var_name)][rowIx, ]
        
        # if prediction is a TN (true -ve), measure distance from 0.0
        tn <- which((df[, predct_var_name] == df[, glb_rsp_var]) &
                    (df[, predct_var_name] == levels(df[, glb_rsp_var])[1]))
        df[tn, predct_erabs_var_name] <- abs(0 - df[tn, predct_prob_var_name])
        #rowIx <- which.max(df[tn, predct_erabs_var_name]); df[tn, c(glbFeatsId, glb_rsp_var, predct_var_name, predct_prob_var_name, predct_erabs_var_name)][rowIx, ]
        
        # if prediction is a FP (false +ve), measure distance from 0.0
        fp <- which((df[, predct_var_name] != df[, glb_rsp_var]) &
                    (df[, predct_var_name] == levels(df[, glb_rsp_var])[2]))
        df[fp, predct_erabs_var_name] <- abs(0 - df[fp, predct_prob_var_name])
        #rowIx <- which.max(df[fp, predct_erabs_var_name]); df[fp, c(glbFeatsId, glb_rsp_var, predct_var_name, predct_prob_var_name, predct_erabs_var_name)][rowIx, ]
        
        # if prediction is a FN (false -ve), measure distance from 1.0
        fn <- which((df[, predct_var_name] != df[, glb_rsp_var]) &
                    (df[, predct_var_name] == levels(df[, glb_rsp_var])[1]))
        df[fn, predct_erabs_var_name] <- abs(1 - df[fn, predct_prob_var_name])
        #rowIx <- which.max(df[fn, predct_erabs_var_name]); df[fn, c(glbFeatsId, glb_rsp_var, predct_var_name, predct_prob_var_name, predct_erabs_var_name)][rowIx, ]

        
        if (verbose) print(head(orderBy(reformulate(c("-", predct_erabs_var_name)), df)))
        
        df[, predct_accurate_var_name] <- (df[, glb_rsp_var] == df[, predct_var_name])
    }    
    
    if (glb_is_classification && !glb_is_binomial) {
        df[, predct_var_name] <- predict(mdl, newdata = df, type = "raw")
        probCls <- predict(mdl, newdata = df, type = "prob")        
        df[, predct_prob_var_name] <- NA
        for (cls in names(probCls)) {
            mask <- (df[, predct_var_name] == cls)
            df[mask, predct_prob_var_name] <- probCls[mask, cls]
        }    
        if (verbose) print(myplot_histogram(df, predct_prob_var_name, 
                                            fill_col_name = predct_var_name))
        if (verbose) print(myplot_histogram(df, predct_prob_var_name, 
                                            facet_frmla = paste0("~", glb_rsp_var)))
        
        df[, predct_error_var_name] <- df[, predct_var_name] != df[, glb_rsp_var]
        
        # if prediction is erroneous, record the probability the model assigned to the actual class
        df[, predct_erabs_var_name] <- 0
        for (cls in names(probCls)) {
            mask <- (df[, glb_rsp_var] == cls) & (df[, predct_error_var_name])
            df[mask, predct_erabs_var_name] <- probCls[mask, cls]
        }    

        df[, predct_accurate_var_name] <- (df[, glb_rsp_var] == df[, predct_var_name])        
    }

    return(df)
}    
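
The binomial branch's level-indexing idiom is compact enough to warrant a standalone sketch (toy data, assuming the level order `c("D", "R")` used in this report): `(prob >= threshold) * 1 + 1` yields 1 or 2, which indexes the factor levels.

```r
# Toy data: map class-"R" probabilities to factor predictions via level
# indexing, as glb_get_predictions() does for binomial outcomes.
actual <- factor(c("D", "R", "R", "D"), levels = c("D", "R"))
prob_R <- c(0.30, 0.80, 0.45, 0.60)
threshold <- 0.5
pred <- factor(levels(actual)[(prob_R >= threshold) * 1 + 1],
               levels = levels(actual))
as.character(pred)  # "D" "R" "D" "R"
# The TP/TN/FP/FN error-magnitude rules above collapse to: |1 - prob| when
# the actual class is the second level, |0 - prob| when it is the first.
```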

if (glb_is_classification && glb_is_binomial && 
        (length(unique(glbObsFit[, glb_rsp_var])) < 2))
    stop("glbObsFit$", glb_rsp_var, ": contains less than 2 unique values: ",
         paste0(unique(glbObsFit[, glb_rsp_var]), collapse=", "))

max_cor_y_x_vars <- orderBy(~ -cor.y.abs, 
        subset(glb_feats_df, (exclude.as.feat == 0) & !nzv & !is.cor.y.abs.low & 
                                is.na(cor.high.X)))[1:2, "id"]
max_cor_y_x_vars <- max_cor_y_x_vars[!is.na(max_cor_y_x_vars)]
if (length(max_cor_y_x_vars) < 2)
    max_cor_y_x_vars <- union(max_cor_y_x_vars, ".pos")
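
A hedged, toy-data sketch of the feature pick above: among features that are not excluded and not near-zero-variance, take the two with the largest absolute correlation to the response (base R `order` stands in for `doBy::orderBy`).

```r
# Hypothetical feature table mirroring the glb_feats_df columns used above.
feats <- data.frame(id = c("a", "b", "c", "d"),
                    cor.y.abs = c(0.02, 0.31, 0.12, 0.25),
                    exclude.as.feat = c(0, 0, 0, 1),
                    nzv = FALSE,
                    stringsAsFactors = FALSE)
cand <- subset(feats, exclude.as.feat == 0 & !nzv)
cand[order(-cand$cor.y.abs), "id"][1:2]  # "b" "c"
```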

if (!is.null(glb_Baseline_mdl_var)) {
    if ((max_cor_y_x_vars[1] != glb_Baseline_mdl_var) & 
        (glb_feats_df[glb_feats_df$id == max_cor_y_x_vars[1], "cor.y.abs"] > 
         glb_feats_df[glb_feats_df$id == glb_Baseline_mdl_var, "cor.y.abs"]))
        stop(max_cor_y_x_vars[1], " has a higher correlation with ", glb_rsp_var, 
             " than the Baseline var: ", glb_Baseline_mdl_var)
}

glb_model_type <- ifelse(glb_is_regression, "regression", "classification")
    
# Model specs
# c("id.prefix", "method", "type",
#   # trainControl params
#   "preProc.method", "cv.n.folds", "cv.n.repeats", "summary.fn",
#   # train params
#   "metric", "metric.maximize", "tune.df")

# Baseline
if (!is.null(glb_Baseline_mdl_var)) {
    fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
                            paste0("fit.models_0_", "Baseline"), major.inc = FALSE,
                                    label.minor = "mybaseln_classfr")
    ret_lst <- myfit_mdl(mdl_id="Baseline", 
                         model_method="mybaseln_classfr",
                        indepVar=glb_Baseline_mdl_var,
                        rsp_var=glb_rsp_var,
                        fit_df=glbObsFit, OOB_df=glbObsOOB)
}    

# Most Frequent Outcome "MFO" model: mean(y) for regression
#   Not using caret's nullModel since model stats are not available
#   Cannot use rpart for multinomial classification since it predicts non-MFO
if (glb_is_classification) {
    fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
                                paste0("fit.models_0_", "MFO"), major.inc = FALSE,
                                        label.minor = "myMFO_classfr")

    ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
        id.prefix = "MFO", type = glb_model_type, trainControl.method = "none",
        train.method = ifelse(glb_is_regression, "lm", "myMFO_classfr"))),
                            indepVar = ".rnorm", rsp_var = glb_rsp_var,
                            fit_df = glbObsFit, OOB_df = glbObsOOB)

        # "random" model - only for classification; 
        #   none needed for regression since it is same as MFO
    fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
                                paste0("fit.models_0_", "Random"), major.inc = FALSE,
                                        label.minor = "myrandom_classfr")

#stop(here"); glb2Sav(); all.equal(glb_models_df, sav_models_df)    
    ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
        id.prefix = "Random", type = glb_model_type, trainControl.method = "none",
        train.method = "myrandom_classfr")),
                        indepVar = ".rnorm", rsp_var = glb_rsp_var,
                        fit_df = glbObsFit, OOB_df = glbObsOOB)
}
##              label step_major step_minor   label_minor     bgn     end
## 1 fit.models_0_bgn          1          0         setup 108.647 108.682
## 2 fit.models_0_MFO          1          1 myMFO_classfr 108.682      NA
##   elapsed
## 1   0.035
## 2      NA
## [1] "myfit_mdl: enter: 0.002000 secs"
## [1] "myfit_mdl: fitting model: MFO###myMFO_classfr"
## [1] "    indepVar: .rnorm"
## [1] "myfit_mdl: setup complete: 0.444000 secs"
## Fitting parameter = none on full training set
## [1] "in MFO.Classifier$fit"
## [1] "unique.vals:"
## [1] D R
## Levels: D R
## [1] "unique.prob:"
## y
##         R         D 
## 0.5779022 0.4220978 
## [1] "MFO.val:"
## [1] "R"
## [1] "myfit_mdl: train complete: 0.810000 secs"
##   parameter
## 1      none
##             Length Class      Mode     
## unique.vals 2      factor     numeric  
## unique.prob 2      -none-     numeric  
## MFO.val     1      -none-     character
## x.names     1      -none-     character
## xNames      1      -none-     character
## problemType 1      -none-     character
## tuneValue   1      data.frame list     
## obsLevels   2      -none-     character
## Warning in if (mdl_specs_lst[["train.method"]] == "glm")
## mydisplayOutliers(mdl, : the condition has length > 1 and only the first
## element will be used
## [1] "myfit_mdl: train diagnostics complete: 0.813000 secs"
## Loading required namespace: pROC
## [1] "entr MFO.Classifier$predict"
## [1] "exit MFO.Classifier$predict"
## Loading required package: ROCR
## Loading required package: gplots
## 
## Attaching package: 'gplots'
## The following object is masked from 'package:stats':
## 
##     lowess
## [1] "in MFO.Classifier$prob"
##           D         R
## 1 0.5779022 0.4220978
## 2 0.5779022 0.4220978
## 3 0.5779022 0.4220978
## 4 0.5779022 0.4220978
## 5 0.5779022 0.4220978
## 6 0.5779022 0.4220978

##          Prediction
## Reference    D    R
##         D    0  829
##         R    0 1135
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.779022e-01   0.000000e+00   5.557019e-01   5.998696e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   5.095851e-01  7.271654e-182 
## [1] "entr MFO.Classifier$predict"
## [1] "exit MFO.Classifier$predict"
## [1] "in MFO.Classifier$prob"
##           D         R
## 1 0.5779022 0.4220978
## 2 0.5779022 0.4220978
## 3 0.5779022 0.4220978
## 4 0.5779022 0.4220978
## 5 0.5779022 0.4220978
## 6 0.5779022 0.4220978
##          Prediction
## Reference   D   R
##         D   0 209
##         R   0 286
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.777778e-01   0.000000e+00   5.329004e-01   6.217207e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   5.190818e-01   6.185272e-47 
## [1] "myfit_mdl: predict complete: 6.427000 secs"
##                    id  feats max.nTuningRuns min.elapsedtime.everything
## 1 MFO###myMFO_classfr .rnorm               0                      0.357
##   min.elapsedtime.final max.AUCpROC.fit max.Sens.fit max.Spec.fit
## 1                 0.004             0.5            0            1
##   max.AUCROCR.fit opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1             0.5                    0.4       0.7324944        0.5779022
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.5557019             0.5998696             0
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1             0.5            0            1             0.5
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                    0.4       0.7323944        0.5777778
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5329004             0.6217207             0
## [1] "in MFO.Classifier$prob"
##           D         R
## 1 0.5779022 0.4220978
## 2 0.5779022 0.4220978
## 3 0.5779022 0.4220978
## 4 0.5779022 0.4220978
## 5 0.5779022 0.4220978
## 6 0.5779022 0.4220978
## [1] "myfit_mdl: exit: 6.511000 secs"
##                 label step_major step_minor      label_minor     bgn
## 2    fit.models_0_MFO          1          1    myMFO_classfr 108.682
## 3 fit.models_0_Random          1          2 myrandom_classfr 115.199
##       end elapsed
## 2 115.198   6.516
## 3      NA      NA
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: Random###myrandom_classfr"
## [1] "    indepVar: .rnorm"
## [1] "myfit_mdl: setup complete: 0.423000 secs"
## Fitting parameter = none on full training set
## [1] "myfit_mdl: train complete: 0.708000 secs"
##   parameter
## 1      none
##             Length Class      Mode     
## unique.vals 2      factor     numeric  
## unique.prob 2      table      numeric  
## xNames      1      -none-     character
## problemType 1      -none-     character
## tuneValue   1      data.frame list     
## obsLevels   2      -none-     character
## Warning in if (mdl_specs_lst[["train.method"]] == "glm")
## mydisplayOutliers(mdl, : the condition has length > 1 and only the first
## element will be used

## [1] "myfit_mdl: train diagnostics complete: 0.710000 secs"
## [1] "in Random.Classifier$prob"

##          Prediction
## Reference    D    R
##         D    0  829
##         R    0 1135
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.779022e-01   0.000000e+00   5.557019e-01   5.998696e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   5.095851e-01  7.271654e-182 
## [1] "in Random.Classifier$prob"

##          Prediction
## Reference   D   R
##         D   0 209
##         R   0 286
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.777778e-01   0.000000e+00   5.329004e-01   6.217207e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   5.190818e-01   6.185272e-47 
## [1] "myfit_mdl: predict complete: 7.078000 secs"
##                          id  feats max.nTuningRuns
## 1 Random###myrandom_classfr .rnorm               0
##   min.elapsedtime.everything min.elapsedtime.final max.AUCpROC.fit
## 1                       0.28                 0.002        0.505974
##   max.Sens.fit max.Spec.fit max.AUCROCR.fit opt.prob.threshold.fit
## 1    0.4234017    0.5885463       0.5019826                    0.4
##   max.f.score.fit max.Accuracy.fit max.AccuracyLower.fit
## 1       0.7324944        0.5779022             0.5557019
##   max.AccuracyUpper.fit max.Kappa.fit max.AUCpROC.OOB max.Sens.OOB
## 1             0.5998696             0       0.5026684    0.4354067
##   max.Spec.OOB max.AUCROCR.OOB opt.prob.threshold.OOB max.f.score.OOB
## 1    0.5699301       0.5145381                    0.4       0.7323944
##   max.Accuracy.OOB max.AccuracyLower.OOB max.AccuracyUpper.OOB
## 1        0.5777778             0.5329004             0.6217207
##   max.Kappa.OOB
## 1             0
## [1] "in Random.Classifier$prob"
## [1] "myfit_mdl: exit: 7.881000 secs"
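Both baselines above reduce to simple rules. As a standalone illustration of what the MFO classifier computes (the modal class of the training response, predicted for every observation), a minimal sketch on toy data, not the actual `myMFO_classfr` internals:

```r
# Minimal sketch of a Most-Frequent-Outcome (MFO) baseline.
# Toy response vector only; the report's fit uses glbObsFit.
y <- factor(c("R", "R", "R", "D", "D"), levels = c("D", "R"))

# Modal class of the training response
mfo_val <- names(which.max(table(y)))

# Predict the modal class everywhere; accuracy equals its prevalence
preds <- factor(rep(mfo_val, length(y)), levels = levels(y))
accuracy <- mean(preds == y)   # 3/5 = 0.6 here (0.5779 on glbObsFit above)
```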
# Max.cor.Y
#   Check impact of cv
#       rpart is not a good candidate since caret does not optimize cp (rpart's only tuning parameter) well
fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
                        paste0("fit.models_0_", "Max.cor.Y.rcv.*X*"), major.inc = FALSE,
                                    label.minor = "glmnet")
##                            label step_major step_minor      label_minor
## 3            fit.models_0_Random          1          2 myrandom_classfr
## 4 fit.models_0_Max.cor.Y.rcv.*X*          1          3           glmnet
##       bgn     end elapsed
## 3 115.199 123.093   7.894
## 4 123.093      NA      NA
ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
    id.prefix = "Max.cor.Y.rcv.1X1", type = glb_model_type, trainControl.method = "none",
    train.method = "glmnet")),
                    indepVar = max_cor_y_x_vars, rsp_var = glb_rsp_var, 
                    fit_df = glbObsFit, OOB_df = glbObsOOB)
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: Max.cor.Y.rcv.1X1###glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr"
## [1] "myfit_mdl: setup complete: 0.689000 secs"
## Loading required package: glmnet
## Loading required package: Matrix
## 
## Attaching package: 'Matrix'
## The following object is masked from 'package:tidyr':
## 
##     expand
## Loaded glmnet 2.0-5
## Fitting alpha = 0.1, lambda = 0.00175 on full training set
## [1] "myfit_mdl: train complete: 1.392000 secs"
##   alpha     lambda
## 1   0.1 0.00174551

##             Length Class      Mode     
## a0           51    -none-     numeric  
## beta        204    dgCMatrix  S4       
## df           51    -none-     numeric  
## dim           2    -none-     numeric  
## lambda       51    -none-     numeric  
## dev.ratio    51    -none-     numeric  
## nulldev       1    -none-     numeric  
## npasses       1    -none-     numeric  
## jerr          1    -none-     numeric  
## offset        1    -none-     logical  
## classnames    2    -none-     character
## call          5    -none-     call     
## nobs          1    -none-     numeric  
## lambdaOpt     1    -none-     numeric  
## xNames        4    -none-     character
## problemType   1    -none-     character
## tuneValue     2    data.frame list     
## obsLevels     2    -none-     character
## [1] "min lambda > lambdaOpt:"
##     (Intercept)  Q113181.fctrNo Q113181.fctrYes  Q115611.fctrNo 
##       0.2711735      -0.3237413       0.2651386      -0.1803110 
## Q115611.fctrYes 
##       0.5581858 
## [1] "max lambda < lambdaOpt:"
## [1] "Feats mismatch between coefs_left & rght:"
## [1] "(Intercept)"     "Q113181.fctrNo"  "Q113181.fctrYes" "Q115611.fctrNo" 
## [5] "Q115611.fctrYes"
## [1] "myfit_mdl: train diagnostics complete: 1.502000 secs"

##          Prediction
## Reference   D   R
##         D 346 483
##         R 267 868
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.181263e-01   1.887578e-01   5.962212e-01   6.396780e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   1.575573e-04   4.137546e-15

##          Prediction
## Reference   D   R
##         D   0 209
##         R   0 286
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.777778e-01   0.000000e+00   5.329004e-01   6.217207e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   5.190818e-01   6.185272e-47 
## [1] "myfit_mdl: predict complete: 7.328000 secs"
##                           id                     feats max.nTuningRuns
## 1 Max.cor.Y.rcv.1X1###glmnet Q115611.fctr,Q113181.fctr               0
##   min.elapsedtime.everything min.elapsedtime.final max.AUCpROC.fit
## 1                      0.689                  0.03        0.591064
##   max.Sens.fit max.Spec.fit max.AUCROCR.fit opt.prob.threshold.fit
## 1    0.4173703    0.7647577       0.6295271                    0.5
##   max.f.score.fit max.Accuracy.fit max.AccuracyLower.fit
## 1       0.6983105        0.6181263             0.5962212
##   max.AccuracyUpper.fit max.Kappa.fit max.AUCpROC.OOB max.Sens.OOB
## 1              0.639678     0.1887578       0.5274199    0.3205742
##   max.Spec.OOB max.AUCROCR.OOB opt.prob.threshold.OOB max.f.score.OOB
## 1    0.7342657       0.5358433                    0.4       0.7323944
##   max.Accuracy.OOB max.AccuracyLower.OOB max.AccuracyUpper.OOB
## 1        0.5777778             0.5329004             0.6217207
##   max.Kappa.OOB
## 1             0
## [1] "myfit_mdl: exit: 7.419000 secs"
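The Max.cor.Y.rcv.1X1 model above is a glmnet fit with alpha = 0.1 (mostly ridge) at a fixed lambda. As a reference for what the two tuning parameters do, a minimal standalone binomial glmnet sketch on simulated data (not the report's max_cor_y_x_vars features):

```r
# Minimal binomial glmnet sketch: alpha mixes ridge (0) and lasso (1);
# lambda sets the overall amount of shrinkage. Simulated toy data only.
suppressMessages(library(glmnet))
set.seed(123)
x <- matrix(rnorm(200 * 4), ncol = 4)
y <- factor(ifelse(x[, 1] + rnorm(200) > 0, "R", "D"))

fit <- glmnet(x, y, family = "binomial", alpha = 0.1)
# Coefficients at a fixed lambda (analogous to caret's lambdaOpt above)
coef(fit, s = 0.00175)
```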
if (glbMdlCheckRcv) {
    # rcv_n_folds == 1 & rcv_n_repeats > 1 crashes
    for (rcv_n_folds in seq(3, glb_rcv_n_folds + 2, 2))
        for (rcv_n_repeats in seq(1, glb_rcv_n_repeats + 2, 2)) {
            
            # Experiment specific code to avoid caret crash
    #         lcl_tune_models_df <- rbind(data.frame()
    #                             ,data.frame(method = "glmnet", parameter = "alpha", 
    #                                         vals = "0.100 0.325 0.550 0.775 1.000")
    #                             ,data.frame(method = "glmnet", parameter = "lambda",
    #                                         vals = "9.342e-02")    
    #                                     )
            
            ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
                list(
                id.prefix = paste0("Max.cor.Y.rcv.", rcv_n_folds, "X", rcv_n_repeats), 
                type = glb_model_type, 
    # tune.df = lcl_tune_models_df,            
                trainControl.method = "repeatedcv",
                trainControl.number = rcv_n_folds, 
                trainControl.repeats = rcv_n_repeats,
                trainControl.classProbs = glb_is_classification,
                trainControl.summaryFunction = glbMdlMetricSummaryFn,
                train.method = "glmnet", train.metric = glbMdlMetricSummary, 
                train.maximize = glbMdlMetricMaximize)),
                                indepVar = max_cor_y_x_vars, rsp_var = glb_rsp_var, 
                                fit_df = glbObsFit, OOB_df = glbObsOOB)
        }
    # Add parallel coordinates graph of glb_models_df[, glbMdlMetricsEval] to evaluate cv parameters
    tmp_models_cols <- c("id", "max.nTuningRuns",
                        glbMdlMetricsEval[glbMdlMetricsEval %in% names(glb_models_df)],
                        grep("opt.", names(glb_models_df), fixed = TRUE, value = TRUE)) 
    print(myplot_parcoord(obs_df = subset(glb_models_df, 
                                          grepl("Max.cor.Y.rcv.", id, fixed = TRUE), 
                                            select = -feats)[, tmp_models_cols],
                          id_var = "id"))
}
        
# Useful for stacking decisions
# fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
#                     paste0("fit.models_0_", "Max.cor.Y[rcv.1X1.cp.0|]"), major.inc = FALSE,
#                                     label.minor = "rpart")
# 
# ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
#     id.prefix = "Max.cor.Y.rcv.1X1.cp.0", type = glb_model_type, trainControl.method = "none",
#     train.method = "rpart",
#     tune.df=data.frame(method="rpart", parameter="cp", min=0.0, max=0.0, by=0.1))),
#                     indepVar=max_cor_y_x_vars, rsp_var=glb_rsp_var, 
#                     fit_df=glbObsFit, OOB_df=glbObsOOB)

#stop("here"); glb2Sav(); all.equal(glb_models_df, sav_models_df)
# if (glb_is_regression || glb_is_binomial) # For multinomials this model will be run next by default
ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
                        id.prefix = "Max.cor.Y", 
                        type = glb_model_type, trainControl.method = "repeatedcv",
                        trainControl.number = glb_rcv_n_folds, 
                        trainControl.repeats = glb_rcv_n_repeats,
                        trainControl.classProbs = glb_is_classification,
                        trainControl.summaryFunction = glbMdlMetricSummaryFn,
                        trainControl.allowParallel = glbMdlAllowParallel,                        
                        train.metric = glbMdlMetricSummary, 
                        train.maximize = glbMdlMetricMaximize,    
                        train.method = "rpart")),
                    indepVar = max_cor_y_x_vars, rsp_var = glb_rsp_var, 
                    fit_df = glbObsFit, OOB_df = glbObsOOB)
## [1] "myfit_mdl: enter: 0.000000 secs"
## [1] "myfit_mdl: fitting model: Max.cor.Y##rcv#rpart"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr"
## [1] "myfit_mdl: setup complete: 0.700000 secs"
## Loading required package: rpart
## Aggregating results
## Selecting tuning parameters
## Fitting cp = 0 on full training set
## [1] "myfit_mdl: train complete: 2.097000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst
## = list(id.prefix = "Max.cor.Y", : model's bestTune found at an extreme of
## tuneGrid for parameter: cp
## Loading required package: rpart.plot

## Call:
## rpart(formula = .outcome ~ ., control = list(minsplit = 20, minbucket = 7, 
##     cp = 0, maxcompete = 4, maxsurrogate = 5, usesurrogate = 2, 
##     surrogatestyle = 0, maxdepth = 30, xval = 0))
##   n= 1964 
## 
##           CP nsplit rel error
## 1 0.04764777      0 1.0000000
## 2 0.00000000      2 0.9047045
## 
## Variable importance
## Q115611.fctrYes  Q115611.fctrNo  Q113181.fctrNo Q113181.fctrYes 
##              41              27              20              12 
## 
## Node number 1: 1964 observations,    complexity param=0.04764777
##   predicted class=R  expected loss=0.4220978  P(node) =1
##     class counts:   829  1135
##    probabilities: 0.422 0.578 
##   left son=2 (1270 obs) right son=3 (694 obs)
##   Primary splits:
##       Q115611.fctrYes < 0.5 to the left,  improve=29.91963, (0 missing)
##       Q115611.fctrNo  < 0.5 to the right, improve=23.19226, (0 missing)
##       Q113181.fctrYes < 0.5 to the left,  improve=20.22155, (0 missing)
##       Q113181.fctrNo  < 0.5 to the right, improve=19.11453, (0 missing)
##   Surrogate splits:
##       Q115611.fctrNo < 0.5 to the right, agree=0.85, adj=0.575, (0 split)
## 
## Node number 2: 1270 observations,    complexity param=0.04764777
##   predicted class=R  expected loss=0.4866142  P(node) =0.6466395
##     class counts:   618   652
##    probabilities: 0.487 0.513 
##   left son=4 (613 obs) right son=5 (657 obs)
##   Primary splits:
##       Q113181.fctrNo  < 0.5 to the right, improve=14.353080, (0 missing)
##       Q113181.fctrYes < 0.5 to the left,  improve=12.272800, (0 missing)
##       Q115611.fctrNo  < 0.5 to the right, improve= 1.391157, (0 missing)
##   Surrogate splits:
##       Q113181.fctrYes < 0.5 to the left,  agree=0.810, adj=0.607, (0 split)
##       Q115611.fctrNo  < 0.5 to the right, agree=0.595, adj=0.162, (0 split)
## 
## Node number 3: 694 observations
##   predicted class=R  expected loss=0.3040346  P(node) =0.3533605
##     class counts:   211   483
##    probabilities: 0.304 0.696 
## 
## Node number 4: 613 observations
##   predicted class=D  expected loss=0.4355628  P(node) =0.3121181
##     class counts:   346   267
##    probabilities: 0.564 0.436 
## 
## Node number 5: 657 observations
##   predicted class=R  expected loss=0.414003  P(node) =0.3345214
##     class counts:   272   385
##    probabilities: 0.414 0.586 
## 
## n= 1964 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
## 1) root 1964 829 R (0.4220978 0.5779022)  
##   2) Q115611.fctrYes< 0.5 1270 618 R (0.4866142 0.5133858)  
##     4) Q113181.fctrNo>=0.5 613 267 D (0.5644372 0.4355628) *
##     5) Q113181.fctrNo< 0.5 657 272 R (0.4140030 0.5859970) *
##   3) Q115611.fctrYes>=0.5 694 211 R (0.3040346 0.6959654) *
## [1] "myfit_mdl: train diagnostics complete: 2.968000 secs"

## [1] "mypredict_mdl: maxMetricDf:"
##    threshold   f.score  accuracy
## 10      0.45 0.6983105 0.6181263
## 11      0.50 0.6983105 0.6181263
## 12      0.55 0.6983105 0.6181263

##          Prediction
## Reference   D   R
##         D 346 483
##         R 267 868
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.181263e-01   1.887578e-01   5.962212e-01   6.396780e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   1.575573e-04   4.137546e-15

##          Prediction
## Reference   D   R
##         D   0 209
##         R   0 286
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.777778e-01   0.000000e+00   5.329004e-01   6.217207e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   5.190818e-01   6.185272e-47 
## [1] "myfit_mdl: predict complete: 8.836000 secs"
##                     id                     feats max.nTuningRuns
## 1 Max.cor.Y##rcv#rpart Q115611.fctr,Q113181.fctr               5
##   min.elapsedtime.everything min.elapsedtime.final max.AUCpROC.fit
## 1                      1.391                 0.012        0.591064
##   max.Sens.fit max.Spec.fit max.AUCROCR.fit opt.prob.threshold.fit
## 1    0.4173703    0.7647577       0.6177088                    0.5
##   max.f.score.fit max.Accuracy.fit max.AccuracyLower.fit
## 1       0.6983105        0.6140468             0.5962212
##   max.AccuracyUpper.fit max.Kappa.fit max.AUCpROC.OOB max.Sens.OOB
## 1              0.639678     0.1857728       0.5274199    0.3205742
##   max.Spec.OOB max.AUCROCR.OOB opt.prob.threshold.OOB max.f.score.OOB
## 1    0.7342657       0.5169806                    0.4       0.7323944
##   max.Accuracy.OOB max.AccuracyLower.OOB max.AccuracyUpper.OOB
## 1        0.5777778             0.5329004             0.6217207
##   max.Kappa.OOB max.AccuracySD.fit max.KappaSD.fit
## 1             0         0.02017618      0.04263554
## [1] "myfit_mdl: exit: 8.899000 secs"
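caret selected cp = 0 at an extreme of its default grid for Max.cor.Y##rcv#rpart (hence the bestTune warning above). One workaround is to pass an explicit `tuneGrid`; a hedged sketch on simulated data, not the project's `myfit_mdl` wrapper:

```r
# Sketch: give caret an explicit cp grid so rpart's bestTune need not
# land at an extreme of the default grid. Simulated toy data only.
suppressMessages(library(caret))
set.seed(123)
df <- data.frame(x1 = rnorm(200), x2 = rnorm(200))
df$y <- factor(ifelse(df$x1 + rnorm(200) > 0, "R", "D"))

mdl <- train(y ~ ., data = df, method = "rpart",
             trControl = trainControl(method = "repeatedcv",
                                      number = 3, repeats = 1),
             tuneGrid = data.frame(cp = seq(0, 0.10, by = 0.01)))
mdl$bestTune   # the cp value selected from the supplied grid
```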
if ((length(glbFeatsDateTime) > 0) && 
    (sum(grepl(paste(names(glbFeatsDateTime), "\\.day\\.minutes\\.poly\\.", sep = ""),
               names(glbObsAll))) > 0)) {
    fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
                    paste0("fit.models_0_", "Max.cor.Y.Time.Poly"), major.inc = FALSE,
                                    label.minor = "glmnet")

    indepVars <- c(max_cor_y_x_vars, 
            grep(paste(names(glbFeatsDateTime), "\\.day\\.minutes\\.poly\\.", sep = ""),
                        names(glbObsAll), value = TRUE))
    indepVars <- myadjustInteractionFeats(glb_feats_df, indepVars)
    ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
            id.prefix = "Max.cor.Y.Time.Poly", 
            type = glb_model_type, trainControl.method = "repeatedcv",
            trainControl.number = glb_rcv_n_folds, trainControl.repeats = glb_rcv_n_repeats,
            trainControl.classProbs = glb_is_classification,
            trainControl.summaryFunction = glbMdlMetricSummaryFn,
            trainControl.allowParallel = glbMdlAllowParallel,            
            train.metric = glbMdlMetricSummary, 
            train.maximize = glbMdlMetricMaximize,    
            train.method = "glmnet")),
        indepVar = indepVars,
        rsp_var = glb_rsp_var, 
        fit_df = glbObsFit, OOB_df = glbObsOOB)
}

if ((length(glbFeatsDateTime) > 0) && 
    (sum(grepl(paste(names(glbFeatsDateTime), "\\.last[[:digit:]]", sep = ""),
               names(glbObsAll))) > 0)) {
    fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
                    paste0("fit.models_0_", "Max.cor.Y.Time.Lag"), major.inc = FALSE,
                                    label.minor = "glmnet")

    indepVars <- c(max_cor_y_x_vars, 
            grep(paste(names(glbFeatsDateTime), "\\.last[[:digit:]]", sep = ""),
                        names(glbObsAll), value = TRUE))
    indepVars <- myadjustInteractionFeats(glb_feats_df, indepVars)
    ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
        id.prefix = "Max.cor.Y.Time.Lag", 
        type = glb_model_type, 
        tune.df = glbMdlTuneParams,        
        trainControl.method = "repeatedcv",
        trainControl.number = glb_rcv_n_folds, trainControl.repeats = glb_rcv_n_repeats,
        trainControl.classProbs = glb_is_classification,
        trainControl.summaryFunction = glbMdlMetricSummaryFn,
        trainControl.allowParallel = glbMdlAllowParallel,        
        train.metric = glbMdlMetricSummary, 
        train.maximize = glbMdlMetricMaximize,    
        train.method = "glmnet")),
        indepVar = indepVars,
        rsp_var = glb_rsp_var, 
        fit_df = glbObsFit, OOB_df = glbObsOOB)
}

if (length(glbFeatsText) > 0) {
    fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
                    paste0("fit.models_0_", "Txt.*"), major.inc = FALSE,
                                    label.minor = "glmnet")

    indepVars <- c(max_cor_y_x_vars)
    for (txtFeat in names(glbFeatsText))
        indepVars <- union(indepVars, 
            grep(paste(str_to_upper(substr(txtFeat, 1, 1)), "\\.(?!([T|P]\\.))", sep = ""),
                        names(glbObsAll), perl = TRUE, value = TRUE))
    indepVars <- myadjustInteractionFeats(glb_feats_df, indepVars)
    ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
        id.prefix = "Max.cor.Y.Text.nonTP", 
        type = glb_model_type, 
        tune.df = glbMdlTuneParams,        
        trainControl.method = "repeatedcv",
        trainControl.number = glb_rcv_n_folds, trainControl.repeats = glb_rcv_n_repeats,
        trainControl.classProbs = glb_is_classification,
        trainControl.summaryFunction = glbMdlMetricSummaryFn,
        trainControl.allowParallel = glbMdlAllowParallel,                                
        train.metric = glbMdlMetricSummary, 
        train.maximize = glbMdlMetricMaximize,    
        train.method = "glmnet")),
        indepVar = indepVars,
        rsp_var = glb_rsp_var, 
        fit_df = glbObsFit, OOB_df = glbObsOOB)

    indepVars <- c(max_cor_y_x_vars)
    for (txtFeat in names(glbFeatsText))
        indepVars <- union(indepVars, 
            grep(paste(str_to_upper(substr(txtFeat, 1, 1)), "\\.T\\.", sep = ""),
                        names(glbObsAll), perl = TRUE, value = TRUE))
    indepVars <- myadjustInteractionFeats(glb_feats_df, indepVars)
    ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
        id.prefix = "Max.cor.Y.Text.onlyT", 
        type = glb_model_type, 
        tune.df = glbMdlTuneParams,        
        trainControl.method = "repeatedcv",
        trainControl.number = glb_rcv_n_folds, trainControl.repeats = glb_rcv_n_repeats,
        trainControl.classProbs = glb_is_classification,
        trainControl.summaryFunction = glbMdlMetricSummaryFn,
        train.metric = glbMdlMetricSummary, 
        train.maximize = glbMdlMetricMaximize,    
        train.method = "glmnet")),
        indepVar = indepVars,
        rsp_var = glb_rsp_var, 
        fit_df = glbObsFit, OOB_df = glbObsOOB)

    indepVars <- c(max_cor_y_x_vars)
    for (txtFeat in names(glbFeatsText))
        indepVars <- union(indepVars, 
            grep(paste(str_to_upper(substr(txtFeat, 1, 1)), "\\.P\\.", sep = ""),
                        names(glbObsAll), perl = TRUE, value = TRUE))
    indepVars <- myadjustInteractionFeats(glb_feats_df, indepVars)
    ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
        id.prefix = "Max.cor.Y.Text.onlyP", 
        type = glb_model_type, 
        tune.df = glbMdlTuneParams,        
        trainControl.method = "repeatedcv",
        trainControl.number = glb_rcv_n_folds, trainControl.repeats = glb_rcv_n_repeats,
        trainControl.classProbs = glb_is_classification,
        trainControl.summaryFunction = glbMdlMetricSummaryFn,
        trainControl.allowParallel = glbMdlAllowParallel,        
        train.metric = glbMdlMetricSummary, 
        train.maximize = glbMdlMetricMaximize,    
        train.method = "glmnet")),
        indepVar = indepVars,
        rsp_var = glb_rsp_var, 
        fit_df = glbObsFit, OOB_df = glbObsOOB)
}
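The three Txt.* models above partition text-derived columns with perl regexes; `\.(?!([T|P]\.))` is a negative lookahead that keeps columns starting with the feature's prefix while excluding the `.T.` and `.P.` families. A small sketch on hypothetical column names (the real ones come from glbObsAll):

```r
# Sketch of the perl-regex column selection used above, on hypothetical
# column names (the real ones come from glbObsAll).
cols <- c("D.T.word1", "D.P.topic1", "D.nchars", "D.nwords")

# Negative lookahead: "D." NOT followed by "T." or "P."
# ([T|P] is a character class, so the "|" is matched literally; harmless here)
nonTP <- grep("D\\.(?!([T|P]\\.))", cols, perl = TRUE, value = TRUE)
onlyT <- grep("D\\.T\\.", cols, value = TRUE)
```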

# Interact.High.cor.Y
if (length(int_feats <- setdiff(setdiff(unique(glb_feats_df$cor.high.X), NA), 
                                subset(glb_feats_df, nzv)$id)) > 0) {
    fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
                    paste0("fit.models_0_", "Interact.High.cor.Y"), major.inc = FALSE,
                                    label.minor = "glmnet")

    ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
        id.prefix = "Interact.High.cor.Y", 
        type = glb_model_type, trainControl.method = "repeatedcv",
        trainControl.number = glb_rcv_n_folds, trainControl.repeats = glb_rcv_n_repeats,
        trainControl.classProbs = glb_is_classification,
        trainControl.summaryFunction = glbMdlMetricSummaryFn,
        trainControl.allowParallel = glbMdlAllowParallel,
        train.metric = glbMdlMetricSummary, 
        train.maximize = glbMdlMetricMaximize,    
        train.method = "glmnet")),
        indepVar = c(max_cor_y_x_vars, paste(max_cor_y_x_vars[1], int_feats, sep = ":")),
        rsp_var = glb_rsp_var, 
        fit_df = glbObsFit, OOB_df = glbObsOOB)
}    
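The Interact.High.cor.Y chunk builds indepVar strings such as `featA:featB`; when these reach a model formula, R's `:` operator expands into interaction columns of the design matrix. A minimal sketch on hypothetical two-level factors (not actual glb_feats_df entries):

```r
# Sketch: ":" in an R formula denotes an interaction term; model.matrix
# expands it into product/indicator columns. Hypothetical toy factors.
df <- data.frame(a = factor(c("x", "y", "x", "y")),
                 b = factor(c("u", "u", "v", "v")))

# Main effects plus interaction: with default treatment contrasts the
# interaction indicator column is named "ay:bv"
colnames(model.matrix(~ a * b, df))
```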

# Low.cor.X
fit.models_0_chunk_df <- myadd_chunk(fit.models_0_chunk_df, 
                        paste0("fit.models_0_", "Low.cor.X"), major.inc = FALSE,
                                     label.minor = "glmnet")
##                            label step_major step_minor label_minor     bgn
## 4 fit.models_0_Max.cor.Y.rcv.*X*          1          3      glmnet 123.093
## 5         fit.models_0_Low.cor.X          1          4      glmnet 139.459
##       end elapsed
## 4 139.458  16.365
## 5      NA      NA
indepVar <- mygetIndepVar(glb_feats_df)
indepVar <- setdiff(indepVar, unique(glb_feats_df$cor.high.X))
ret_lst <- myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst = list(
            id.prefix = "Low.cor.X", 
            type = glb_model_type, 
            tune.df = glbMdlTuneParams,        
            trainControl.method = "repeatedcv",
            trainControl.number = glb_rcv_n_folds, trainControl.repeats = glb_rcv_n_repeats,
            trainControl.classProbs = glb_is_classification,
            trainControl.summaryFunction = glbMdlMetricSummaryFn,
            trainControl.allowParallel = glbMdlAllowParallel,
            train.metric = glbMdlMetricSummary, 
            train.maximize = glbMdlMetricMaximize,    
            train.method = "glmnet")),
        indepVar = indepVar, rsp_var = glb_rsp_var, 
        fit_df = glbObsFit, OOB_df = glbObsOOB)
## [1] "myfit_mdl: enter: 0.000000 secs"
## [1] "myfit_mdl: fitting model: Low.cor.X##rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.718000 secs"
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## [1] "myfit_mdl: train complete: 15.073000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst
## = list(id.prefix = "Low.cor.X", : model's bestTune found at an extreme of
## tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             82  -none-     numeric  
## beta        20582  dgCMatrix  S4       
## df             82  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         82  -none-     numeric  
## dev.ratio      82  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##        0.23798411        0.01874708       -0.13442634        0.09255838 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##        0.04606394        0.05583211        0.02832168       -0.17278645 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##        0.08695903       -0.13399486        0.36549211        0.17630958 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##        0.04163127        0.00229355       -0.14923426       -0.08488000 
##     Q99480.fctrNo 
##       -0.08339722 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.216260115         0.027552836        -0.172608767 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.104315647         0.004549068         0.066207479 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.071265356         0.042228759        -0.188579678 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.078590932        -0.148973794         0.363232265 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.190801458         0.009230062         0.061075685 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.010278853        -0.157484555        -0.099872253 
##       Q99480.fctrNo 
##        -0.095613180 
## [1] "myfit_mdl: train diagnostics complete: 15.767000 secs"
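The two coefficient blocks above list the nonzero terms at the grid lambdas immediately bracketing `lambdaOpt` (the `"min lambda > lambdaOpt"` and `"max lambda < lambdaOpt"` printouts). With a caret `train()` object holding a glmnet `finalModel`, they can be reproduced along these lines (a sketch; `mdl` is assumed):

```r
# Sketch: nonzero glmnet coefficients at the grid lambdas on either side of
# lambdaOpt, as in the diagnostics above. `mdl` is an assumed caret train()
# object with a glmnet finalModel.
fit <- mdl$finalModel
lam <- fit$lambdaOpt
lo  <- min(fit$lambda[fit$lambda > lam])  # smallest grid lambda above lambdaOpt
hi  <- max(fit$lambda[fit$lambda < lam])  # largest grid lambda below lambdaOpt
for (s in c(lo, hi)) {
    cf <- as.matrix(coef(fit, s = s))
    print(cf[cf[, 1] != 0, , drop = FALSE])  # keep only the nonzero terms
}
```

Because the elastic-net path shrinks coefficients toward zero as lambda grows, the sparser block corresponds to the larger of the two lambdas.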

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 25.770000 secs"
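Each confusion matrix above is followed by the same statistic block: Accuracy, Kappa, the binomial CI bounds, the no-information (null) accuracy, and the two p-values. `caret::confusionMatrix` produces all of these in one call (a sketch with tiny illustrative vectors):

```r
# Sketch: reproduce the accuracy block printed after each confusion matrix.
# `pred` and `ref` are illustrative D/R factors, not the pipeline's data.
library(caret)
pred <- factor(c("D", "R", "R", "D", "R"), levels = c("D", "R"))
ref  <- factor(c("D", "R", "D", "D", "R"), levels = c("D", "R"))
cm <- confusionMatrix(data = pred, reference = ref)
cm$table    # cross-tabulation of predictions vs. reference labels
cm$overall  # Accuracy, Kappa, AccuracyLower/Upper, AccuracyNull,
            # AccuracyPValue, McnemarPValue
```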
##                      id
## 1 Low.cor.X##rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     14.267                 1.296
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6098108
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1457828
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01567398      0.03089611
## [1] "myfit_mdl: exit: 26.063000 secs"
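The summary row reports two AUC variants for both the fit and OOB sets, `max.AUCpROC.*` and `max.AUCROCR.*`, which appear to come from the pROC and ROCR packages respectively. For a binary D/R outcome they can be computed as follows (a sketch; `probs` and `labels` are illustrative, and note ROCR 1.0.7 supports binary classification only):

```r
# Sketch: the two AUC variants (pROC vs ROCR) for a binary outcome.
# `labels` and `probs` are illustrative stand-ins for the OOB predictions.
library(pROC)
library(ROCR)
labels <- factor(c("D", "R", "R", "D", "R"), levels = c("D", "R"))
probs  <- c(0.2, 0.8, 0.6, 0.4, 0.9)  # predicted P(class == "R")

auc(roc(labels, probs))                          # pROC-style AUC (AUCpROC)

perf <- performance(prediction(probs, labels), "auc")
unlist(perf@y.values)                            # ROCR-style AUC (AUCROCR)
```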
fit.models_0_chunk_df <- 
    myadd_chunk(fit.models_0_chunk_df, "fit.models_0_end", major.inc = FALSE,
                label.minor = "teardown")
##                    label step_major step_minor label_minor     bgn     end
## 5 fit.models_0_Low.cor.X          1          4      glmnet 139.459 165.561
## 6       fit.models_0_end          1          5    teardown 165.562      NA
##   elapsed
## 5  26.102
## 6      NA
rm(ret_lst)

glb_chunks_df <- myadd_chunk(glb_chunks_df, "fit.models", major.inc = FALSE)
##         label step_major step_minor label_minor     bgn     end elapsed
## 14 fit.models          7          0           0 108.107 165.576  57.469
## 15 fit.models          7          1           1 165.577      NA      NA

```{r fit.models_1, cache=FALSE, fig.height=10, fig.width=15, eval=myevlChunk(glbChunks, glbOut$pfx)}

##              label step_major step_minor label_minor     bgn end elapsed
## 1 fit.models_1_bgn          1          0       setup 169.516  NA      NA
##                label step_major step_minor label_minor     bgn     end
## 1   fit.models_1_bgn          1          0       setup 169.516 169.529
## 2 fit.models_1_All.X          1          1       setup 169.530      NA
##   elapsed
## 1   0.013
## 2      NA
##                label step_major step_minor label_minor     bgn     end
## 2 fit.models_1_All.X          1          1       setup 169.530 169.538
## 3 fit.models_1_All.X          1          2      glmnet 169.538      NA
##   elapsed
## 2   0.008
## 3      NA
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X##rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.703000 secs"
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.325, lambda = 0.07 on full training set
## [1] "myfit_mdl: train complete: 14.502000 secs"

##             Length Class      Mode     
## a0             84  -none-     numeric  
## beta        21084  dgCMatrix  S4       
## df             84  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         84  -none-     numeric  
## dev.ratio      84  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##        0.27061370        0.01326647       -0.08819781        0.07411111 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##        0.02012811        0.03730920        0.01312634       -0.13669409 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##        0.10839201       -0.13908704        0.31297357        0.14556708 
##   Q122120.fctrYes     Q98197.fctrNo     Q98869.fctrNo     Q99480.fctrNo 
##        0.01981429       -0.13048792       -0.06873957       -0.06847183 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##        0.2475835836        0.0224233321       -0.1265747584 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##        0.0865172276        0.0005144989        0.0398301715 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##        0.0520950487        0.0269135343       -0.1493928760 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##        0.1048488591       -0.1510882682        0.3176426185 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##        0.1615497402        0.0010406929        0.0384779635 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##        0.0072895215       -0.1388728256       -0.0828083261 
##       Q99480.fctrNo 
##       -0.0813131230 
## [1] "myfit_mdl: train diagnostics complete: 15.174000 secs"

##          Prediction
## Reference   D   R
##         D 414 415
##         R 319 816
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.262729e-01   2.218020e-01   6.044463e-01   6.477217e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   7.059185e-06   4.540176e-04

##          Prediction
## Reference   D   R
##         D  50 159
##         R  47 239
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.838384e-01   8.072194e-02   5.390116e-01   6.276579e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   4.109019e-01   1.044351e-14 
## [1] "myfit_mdl: predict complete: 24.910000 secs"
##                  id
## 1 All.X##rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     13.709                 1.303
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5754271    0.2882992    0.8625551       0.6573027
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6897718          0.60947
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6044463             0.6477217     0.1376453
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5374494    0.2392344    0.8356643        0.563188
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                    0.5       0.6988304        0.5838384
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5390116             0.6276579    0.08072194
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01543059      0.03009977
## [1] "myfit_mdl: exit: 25.559000 secs"
##                  label step_major step_minor label_minor     bgn     end
## 3   fit.models_1_All.X          1          2      glmnet 169.538 195.123
## 4 fit.models_1_preProc          1          3     preProc 195.123      NA
##   elapsed
## 3  25.585
## 4      NA
## Loading required package: gdata
## gdata: read.xls support for 'XLS' (Excel 97-2004) files ENABLED.
## 
## gdata: read.xls support for 'XLSX' (Excel 2007+) files ENABLED.
## 
## Attaching package: 'gdata'
## The following objects are masked from 'package:dplyr':
## 
##     combine, first, last
## The following object is masked from 'package:stats':
## 
##     nobs
## The following object is masked from 'package:utils':
## 
##     object.size
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#zv#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.718000 secs"
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## [1] "myfit_mdl: train complete: 15.796000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             82  -none-     numeric  
## beta        20172  dgCMatrix  S4       
## df             82  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         82  -none-     numeric  
## dev.ratio      82  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        246  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##        0.23798411        0.01874708       -0.13442634        0.09255838 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##        0.04606394        0.05583211        0.02832168       -0.17278645 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##        0.08695903       -0.13399486        0.36549211        0.17630958 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##        0.04163127        0.00229355       -0.14923426       -0.08488000 
##     Q99480.fctrNo 
##       -0.08339722 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.216260115         0.027552836        -0.172608767 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.104315647         0.004549068         0.066207479 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.071265356         0.042228759        -0.188579678 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.078590932        -0.148973794         0.363232265 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.190801458         0.009230062         0.061075685 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.010278853        -0.157484555        -0.099872253 
##       Q99480.fctrNo 
##        -0.095613180 
## [1] "myfit_mdl: train diagnostics complete: 16.488000 secs"

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 26.240000 secs"
##                    id
## 1 All.X#zv#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     14.994                 1.389
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6098108
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1457828
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01567398      0.03089611
## [1] "myfit_mdl: exit: 26.540000 secs"
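The `#zv` suffix suggests zero-variance predictors were filtered before fitting (note `xNames` drops from 251 to 246), and the `#nzv` variant fit next applies a near-zero-variance filter as well (230 columns). In caret this filtering is commonly done with `nearZeroVar`; a sketch with an illustrative data frame, not the pipeline's actual preprocessing:

```r
# Sketch: drop zero- and near-zero-variance predictors before fitting,
# as the #zv / #nzv model variants appear to do. `X` is illustrative.
library(caret)
set.seed(123)
X <- data.frame(a = rnorm(100),
                b = rep(1, 100),          # zero variance
                c = c(rep(0, 98), 1, 2))  # near-zero variance
zv <- nearZeroVar(X, saveMetrics = TRUE)  # freqRatio, percentUnique, zeroVar, nzv
X_zv  <- X[, !zv$zeroVar, drop = FALSE]   # remove zero-variance columns only
X_nzv <- X[, !zv$nzv,     drop = FALSE]   # remove near-zero-variance columns too
names(X_nzv)
```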
## [1] "myfit_mdl: enter: 0.000000 secs"
## [1] "myfit_mdl: fitting model: All.X#nzv#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.718000 secs"
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## [1] "myfit_mdl: train complete: 20.530000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             81  -none-     numeric  
## beta        18630  dgCMatrix  S4       
## df             81  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         81  -none-     numeric  
## dev.ratio      81  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        230  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy   Q101163.fctrDad    Q106997.fctrGr 
##       0.232995281       0.023468084       0.093140617       0.046368107 
##  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo   Q113181.fctrYes 
##       0.056116163       0.028904547      -0.171850825       0.086450820 
##    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight   Q122120.fctrYes 
##      -0.133769688       0.365911659       0.177911953       0.041674684 
##   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo     Q99480.fctrNo 
##       0.001364229      -0.151313553      -0.086095991      -0.084573309 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy     Q101163.fctrDad 
##         0.209866910         0.033813656         0.105064367 
##     Q106388.fctrYes      Q106997.fctrGr    Q108855.fctrYes! 
##         0.003565671         0.066629915         0.071616017 
##      Q110740.fctrPC      Q113181.fctrNo     Q113181.fctrYes 
##         0.042967516        -0.187381186         0.077929007 
##      Q115611.fctrNo     Q115611.fctrYes   Q116881.fctrRight 
##        -0.148722825         0.363774514         0.192898198 
## Q120472.fctrScience     Q122120.fctrYes     Q123621.fctrYes 
##         0.009453961         0.061093106         0.009228911 
##       Q98197.fctrNo       Q98869.fctrNo       Q99480.fctrNo 
##        -0.160184243        -0.101480408        -0.097117632 
## [1] "myfit_mdl: train diagnostics complete: 21.244000 secs"

##          Prediction
## Reference   D   R
##         D 423 406
##         R 317 818
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.318737e-01   2.343497e-01   6.101049e-01   6.532481e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   6.125960e-07   1.065047e-03

##          Prediction
## Reference   D   R
##         D  14 195
##         R   7 279
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.919192e-01   4.836683e-02   5.471702e-01   6.355636e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.776271e-01   1.545622e-39 
## [1] "myfit_mdl: predict complete: 31.305000 secs"
##                     id
## 1 All.X#nzv#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     19.728                 2.165
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5843381     0.318456    0.8502203       0.6584261
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6935142        0.6086221
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6101049             0.6532481     0.1430231
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5358852    0.2535885    0.8181818       0.5617158
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7342105        0.5919192
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5471702             0.6355636    0.04836683
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01616259       0.0321425
## [1] "myfit_mdl: exit: 31.612000 secs"
## [1] "myfit_mdl: enter: 0.000000 secs"
## [1] "myfit_mdl: fitting model: All.X#BoxCox#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.713000 secs"
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## Warning in is.na(lam): is.na() applied to non-(list or vector) of type
## 'NULL'

## [1] "myfit_mdl: train complete: 20.098000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda
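This warning means the selected `lambda` sits at an edge of `tuneGrid`, so the search may be truncated. A hedged sketch of widening the grid before re-running `caret::train(method = "glmnet")` (the grid values below are illustrative, not the ones this report used):

```r
# Sketch: widen the elastic-net tuning grid so the optimum (alpha = 0.55,
# lambda = 0.0376 here) falls in the grid interior rather than at an edge.
# These values are illustrative placeholders.
wider_grid <- expand.grid(
  alpha  = seq(0.10, 1.00, by = 0.15),      # 7 mixing values, includes 0.55
  lambda = 10^seq(-4, 0, length.out = 10)   # spans well past 0.0376 both ways
)
nrow(wider_grid)  # 70 candidate (alpha, lambda) pairs
# then: caret::train(..., method = "glmnet", tuneGrid = wider_grid)
```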

##             Length Class      Mode     
## a0             82  -none-     numeric  
## beta        20582  dgCMatrix  S4       
## df             82  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         82  -none-     numeric  
## dev.ratio      82  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##        0.23798411        0.01874708       -0.13442634        0.09255838 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##        0.04606394        0.05583211        0.02832168       -0.17278645 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##        0.08695903       -0.13399486        0.36549211        0.17630958 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##        0.04163127        0.00229355       -0.14923426       -0.08488000 
##     Q99480.fctrNo 
##       -0.08339722 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.216260115         0.027552836        -0.172608767 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.104315647         0.004549068         0.066207479 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.071265356         0.042228759        -0.188579678 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.078590932        -0.148973794         0.363232265 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.190801458         0.009230062         0.061075685 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.010278853        -0.157484555        -0.099872253 
##       Q99480.fctrNo 
##        -0.095613180 
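The two listings labelled "min lambda > lambdaOpt" and "max lambda < lambdaOpt" show coefficients at the stored path values bracketing the selected lambda, since `lambdaOpt` need not lie exactly on glmnet's fitted lambda path. A base-R sketch of that bracketing logic (the path below is a hypothetical stand-in, not this fit's; with a real fit, pass each bracket to `coef(finalModel, s = ...)`):

```r
# Sketch of the bracketing used above: find the stored lambda values on
# either side of caret's lambdaOpt. The path here is a made-up example.
lambda_path <- 10^seq(0, -3, length.out = 82)  # descending, like a glmnet path
lambda_opt  <- 0.0376                          # the value caret selected

above <- min(lambda_path[lambda_path > lambda_opt])  # "min lambda > lambdaOpt"
below <- max(lambda_path[lambda_path < lambda_opt])  # "max lambda < lambdaOpt"
c(above = above, below = below)
# with a real fit: coef(finalModel, s = above); coef(finalModel, s = below)
```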
## [1] "myfit_mdl: train diagnostics complete: 20.770000 secs"

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 30.445000 secs"
##                        id
## 1 All.X#BoxCox#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                       19.3                 1.861
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6098108
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1457828
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01567398      0.03089611

## [1] "myfit_mdl: exit: 30.760000 secs"
## [1] "myfit_mdl: enter: 0.000000 secs"
## [1] "myfit_mdl: fitting model: All.X#YeoJohnson#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.730000 secs"
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## [1] "myfit_mdl: train complete: 54.600000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             98  -none-     numeric  
## beta        24598  dgCMatrix  S4       
## df             98  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         98  -none-     numeric  
## dev.ratio      98  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##        0.23798411        0.01874708       -0.13442634        0.09255838 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##        0.04606394        0.05583211        0.02832168       -0.17278645 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##        0.08695903       -0.13399486        0.36549211        0.17630958 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##        0.04163127        0.00229355       -0.14923426       -0.08488000 
##     Q99480.fctrNo 
##       -0.08339722 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.216260115         0.027552836        -0.172608767 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.104315647         0.004549068         0.066207479 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.071265356         0.042228759        -0.188579678 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.078590932        -0.148973794         0.363232265 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.190801458         0.009230062         0.061075685 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.010278853        -0.157484555        -0.099872253 
##       Q99480.fctrNo 
##        -0.095613180 
## [1] "myfit_mdl: train diagnostics complete: 55.315000 secs"

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 65.482000 secs"
##                            id
## 1 All.X#YeoJohnson#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     53.786                 6.994
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6093019
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1445959
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01436005      0.02840821
## [1] "myfit_mdl: exit: 65.824000 secs"
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#expoTrans#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.701000 secs"
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## [1] "myfit_mdl: train complete: 55.844000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0            100  -none-     numeric  
## beta        25100  dgCMatrix  S4       
## df            100  -none-     numeric  
## dim             2  -none-     numeric  
## lambda        100  -none-     numeric  
## dev.ratio     100  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##        0.23798411        0.01874708       -0.13442634        0.09255838 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##        0.04606394        0.05583211        0.02832168       -0.17278645 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##        0.08695903       -0.13399486        0.36549211        0.17630958 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##        0.04163127        0.00229355       -0.14923426       -0.08488000 
##     Q99480.fctrNo 
##       -0.08339722 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.216260115         0.027552836        -0.172608767 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.104315647         0.004549068         0.066207479 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.071265356         0.042228759        -0.188579678 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.078590932        -0.148973794         0.363232265 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.190801458         0.009230062         0.061075685 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.010278853        -0.157484555        -0.099872253 
##       Q99480.fctrNo 
##        -0.095613180 
## [1] "myfit_mdl: train diagnostics complete: 56.549000 secs"

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 66.470000 secs"
##                           id
## 1 All.X#expoTrans#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     55.054                 6.911
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6089627
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1438465
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01450064      0.02901614
## [1] "myfit_mdl: exit: 66.811000 secs"
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#center#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.721000 secs"
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## [1] "myfit_mdl: train complete: 18.604000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             82  -none-     numeric  
## beta        20582  dgCMatrix  S4       
## df             82  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         82  -none-     numeric  
## dev.ratio      82  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##        0.32427436        0.01874708       -0.13442634        0.09255838 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##        0.04606394        0.05583211        0.02832168       -0.17278645 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##        0.08695903       -0.13399486        0.36549211        0.17630958 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##        0.04163127        0.00229355       -0.14923426       -0.08488000 
##     Q99480.fctrNo 
##       -0.08339722 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.325377824         0.027552836        -0.172608767 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.104315647         0.004549068         0.066207479 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.071265356         0.042228759        -0.188579678 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.078590932        -0.148973794         0.363232265 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.190801458         0.009230062         0.061075685 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.010278853        -0.157484555        -0.099872253 
##       Q99480.fctrNo 
##        -0.095613180 
## [1] "myfit_mdl: train diagnostics complete: 19.299000 secs"

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 29.585000 secs"
##                        id
## 1 All.X#center#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     17.758                 1.489
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6098108
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1457828
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01567398      0.03089611
## [1] "myfit_mdl: exit: 29.937000 secs"
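The run above warns that the model's `bestTune` landed at an extreme of `tuneGrid` for `lambda`, meaning the grid may have been too narrow to bracket the optimum. A minimal sketch of how such a fit could be widened and refit until the selected `lambda` is an interior point (object names `fit_df`, `indep_vars` are illustrative, not from the actual script):

```r
library(caret)

ctrl <- trainControl(method = "repeatedcv", number = 5, repeats = 5,
                     classProbs = TRUE, summaryFunction = twoClassSummary)

# Initial alpha/lambda grid; hypothetical ranges for illustration.
tune_grid <- expand.grid(alpha  = seq(0.1, 1.0, by = 0.45),
                         lambda = 10 ^ seq(-4, -1, length.out = 10))
fit <- train(x = fit_df[, indep_vars], y = fit_df$.outcome,
             method = "glmnet", preProcess = "center",
             trControl = ctrl, tuneGrid = tune_grid, metric = "ROC")

# If the chosen lambda sits at the top of the grid (the warning's condition),
# extend the grid upward at the chosen alpha and refit.
while (fit$bestTune$lambda >= max(tune_grid$lambda)) {
    tune_grid <- expand.grid(alpha  = fit$bestTune$alpha,
                             lambda = max(tune_grid$lambda) * c(1, 2, 4, 8))
    fit <- train(x = fit_df[, indep_vars], y = fit_df$.outcome,
                 method = "glmnet", preProcess = "center",
                 trControl = ctrl, tuneGrid = tune_grid, metric = "ROC")
}
```

This is a sketch of one possible remediation, not the report's actual `myfit_mdl` logic, which handles grid setup internally.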
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#scale#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.703000 secs"
## Warning in preProcess.default(method = "scale", x =
## structure(c(-0.480112420809766, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## Warning in preProcess.default(thresh = 0.95, k = 5, method = "scale", x
## = structure(c(-0.480112420809766, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff

## [1] "myfit_mdl: train complete: 17.900000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             82  -none-     numeric  
## beta        20582  dgCMatrix  S4       
## df             82  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         82  -none-     numeric  
## dev.ratio      82  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##       0.237984114       0.008912228      -0.020762127       0.046273476 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##       0.022991875       0.027905424       0.014138859      -0.086176124 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##       0.042435423      -0.067012792       0.174754639       0.076155233 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##       0.016832232       0.001126528      -0.074491997      -0.031155883 
##     Q99480.fctrNo 
##      -0.030845513 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.216260115         0.013098424        -0.026659397 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.052151384         0.001939986         0.033046109 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.035619108         0.021081604        -0.094052895 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.038351849        -0.074503974         0.173674126 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.082414860         0.004553742         0.024693943 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.005048686        -0.078610226        -0.036658911 
##       Q99480.fctrNo 
##        -0.035363738 
## [1] "myfit_mdl: train diagnostics complete: 18.586000 secs"

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 28.416000 secs"
##                       id
## 1 All.X#scale#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     17.103                 1.467
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6098108
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1457828
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01567398      0.03089611
## [1] "myfit_mdl: exit: 28.821000 secs"
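The printed fit-set confusion matrix (Prediction vs. Reference over D/R) fully determines the Accuracy and Kappa values shown alongside it. A small check, rebuilding that table and passing it to `caret::confusionMatrix` (caret expects predictions in rows, reference in columns):

```r
library(caret)

# Fit-set counts from the log: Pred D / Ref D = 429, Pred D / Ref R = 322,
# Pred R / Ref D = 400, Pred R / Ref R = 813.
conf_tbl <- as.table(matrix(c(429, 400, 322, 813), nrow = 2,
                            dimnames = list(Prediction = c("D", "R"),
                                            Reference  = c("D", "R"))))
cm <- confusionMatrix(conf_tbl)
cm$overall[c("Accuracy", "Kappa")]
# Accuracy = (429 + 813) / 1964 ~ 0.6324; Kappa ~ 0.2368, matching the log.
```

The same computation on the OOB table (16/193/8/278) reproduces the much weaker OOB Kappa of ~0.055, which is the more honest indicator of generalization here.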
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#center.scale#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.701000 secs"
## Warning in preProcess.default(method = c("center", "scale"), x =
## structure(c(-0.480112420809766, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## Warning in preProcess.default(thresh = 0.95, k = 5, method = c("center", :
## These variables have zero variances: Q115611.fctrNo:.clusterid.fctr4,
## Q115611.fctrYes:.clusterid.fctr4, Q115611.fctrNo:.clusterid.fctr5,
## Q115611.fctrYes:.clusterid.fctr5, YOB.Age.fctrNA:YOB.Age.dff

## [1] "myfit_mdl: train complete: 17.516000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             82  -none-     numeric  
## beta        20582  dgCMatrix  S4       
## df             82  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         82  -none-     numeric  
## dev.ratio      82  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##       0.324274358       0.008912228      -0.020762127       0.046273476 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##       0.022991875       0.027905424       0.014138859      -0.086176124 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##       0.042435423      -0.067012792       0.174754639       0.076155233 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##       0.016832232       0.001126528      -0.074491997      -0.031155883 
##     Q99480.fctrNo 
##      -0.030845513 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.325377824         0.013098424        -0.026659397 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.052151384         0.001939986         0.033046109 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.035619108         0.021081604        -0.094052895 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.038351849        -0.074503974         0.173674126 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.082414860         0.004553742         0.024693943 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.005048686        -0.078610226        -0.036658911 
##       Q99480.fctrNo 
##        -0.035363738 
## [1] "myfit_mdl: train diagnostics complete: 18.203000 secs"

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 28.160000 secs"
##                              id
## 1 All.X#center.scale#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     16.713                  1.63
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6098108
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1457828
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01567398      0.03089611
## [1] "myfit_mdl: exit: 28.559000 secs"
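Each preprocessed fit repeats the `preProcess` warning about zero-variance columns (the `Q115611.fctr*:.clusterid.fctr4/5` and `YOB.Age.fctrNA:YOB.Age.dff` interactions). These could be screened out of the model matrix before training; a sketch using `caret::nearZeroVar` (`fit_mtrx` is an illustrative model-matrix name):

```r
library(caret)

# Flag columns with zero variance in the fit partition and drop them.
zv_metrics <- nearZeroVar(fit_mtrx, saveMetrics = TRUE)
drop_cols  <- rownames(zv_metrics)[zv_metrics$zeroVar]
# e.g. "Q115611.fctrNo:.clusterid.fctr4", "YOB.Age.fctrNA:YOB.Age.dff"
fit_mtrx   <- fit_mtrx[, setdiff(colnames(fit_mtrx), drop_cols), drop = FALSE]
```

Note also that the center, scale, and center.scale runs produce identical confusion matrices and AUC/Accuracy rows even though their printed coefficients differ: glmnet standardizes predictors internally (`standardize = TRUE` by default), so caret-level centering and scaling changes the coefficient scale but not the predictions.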
## [1] "myfit_mdl: enter: 0.002000 secs"
## [1] "myfit_mdl: fitting model: All.X#range#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.707000 secs"
## Warning in preProcess.default(method = "range", x =
## structure(c(-0.480112420809766, : No variation for for:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## Warning in preProcess.default(thresh = 0.95, k = 5, method =
## "range", x = structure(c(-0.480112420809766, : No variation for for:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff

## [1] "myfit_mdl: train complete: 17.840000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             82  -none-     numeric  
## beta        20582  dgCMatrix  S4       
## df             82  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         82  -none-     numeric  
## dev.ratio      82  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##        0.23798411        0.01874708       -0.13442634        0.09255838 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##        0.04606394        0.05583211        0.02832168       -0.17278645 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##        0.08695903       -0.13399486        0.36549211        0.17630958 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##        0.04163127        0.00229355       -0.14923426       -0.08488000 
##     Q99480.fctrNo 
##       -0.08339722 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.216260115         0.027552836        -0.172608767 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.104315647         0.004549068         0.066207479 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.071265356         0.042228759        -0.188579678 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.078590932        -0.148973794         0.363232265 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.190801458         0.009230062         0.061075685 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.010278853        -0.157484555        -0.099872253 
##       Q99480.fctrNo 
##        -0.095613180 
## [1] "myfit_mdl: train diagnostics complete: 18.524000 secs"

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 28.470000 secs"
##                       id
## 1 All.X#range#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     17.036                 1.524
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6098108
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1457828
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01567398      0.03089611
## [1] "myfit_mdl: exit: 28.921000 secs"
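As a quick sanity check (not part of the pipeline), the OOB Accuracy and Kappa reported above can be reproduced directly from the printed confusion matrix (rows = Reference, columns = Prediction) in base R:

```r
# OOB confusion matrix for All.X#range#rcv#glmnet, copied from the output above
cm <- matrix(c( 16, 193,
                 8, 278),
             nrow = 2, byrow = TRUE,
             dimnames = list(Reference  = c("D", "R"),
                             Prediction = c("D", "R")))
n        <- sum(cm)
accuracy <- sum(diag(cm)) / n                     # observed agreement
p_e      <- sum(rowSums(cm) * colSums(cm)) / n^2  # agreement expected by chance
kappa    <- (accuracy - p_e) / (1 - p_e)          # Cohen's kappa
round(c(Accuracy = accuracy, Kappa = kappa), 7)
#  Accuracy 0.5939394, Kappa 0.0551551 -- matching the summary above
```

The near-zero Kappa confirms that, despite ~59% accuracy, this model barely improves on the chance agreement implied by its heavily R-skewed predictions.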
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#zv.pca#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.713000 secs"
## + Fold1.Rep1: alpha=0.100, lambda=0.04607 
## - Fold1.Rep1: alpha=0.100, lambda=0.04607 
## (the +/- log pair above repeats for alpha = 0.325, 0.550, 0.775 and 1.000
##  across Fold1-Fold3 and Rep1-Rep3, all at lambda = 0.04607; the remaining
##  88 identical-pattern log lines are elided)
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.03 on full training set
## [1] "myfit_mdl: train complete: 39.656000 secs"

##             Length Class      Mode     
## a0            59   -none-     numeric  
## beta        8201   dgCMatrix  S4       
## df            59   -none-     numeric  
## dim            2   -none-     numeric  
## lambda        59   -none-     numeric  
## dev.ratio     59   -none-     numeric  
## nulldev        1   -none-     numeric  
## npasses        1   -none-     numeric  
## jerr           1   -none-     numeric  
## offset         1   -none-     logical  
## classnames     2   -none-     character
## call           5   -none-     call     
## nobs           1   -none-     numeric  
## lambdaOpt      1   -none-     numeric  
## xNames       139   -none-     character
## problemType    1   -none-     character
## tuneValue      2   data.frame list     
## obsLevels      2   -none-     character
## [1] "min lambda > lambdaOpt:"
##  (Intercept)          PC1          PC2          PC3          PC5 
##  0.320962889  0.000508388 -0.024422219 -0.043633777  0.012116599 
##          PC6          PC7          PC9         PC13         PC14 
##  0.004583478 -0.086228900 -0.049270700  0.063728483 -0.054045306 
##         PC17         PC24         PC25         PC26         PC28 
##  0.013184267 -0.009824787 -0.001993748 -0.001196895 -0.002836108 
##         PC41         PC43         PC68         PC94        PC106 
##  0.023702067  0.024099830  0.054511033 -0.013807616  0.028552423 
##        PC113        PC119        PC120        PC121 
## -0.027906941 -0.025632444  0.003471869 -0.042191527 
## [1] "max lambda < lambdaOpt:"
##   (Intercept)           PC1           PC2           PC3           PC4 
##  0.3221984020  0.0022053824 -0.0266674841 -0.0463798217 -0.0027091039 
##           PC5           PC6           PC7           PC9          PC13 
##  0.0150981908  0.0077876908 -0.0902985958 -0.0531059128  0.0683404754 
##          PC14          PC17          PC24          PC25          PC26 
## -0.0584928023  0.0173872940 -0.0142954364 -0.0064086157 -0.0056908561 
##          PC28          PC34          PC41          PC43          PC63 
## -0.0073801819  0.0032980445  0.0286676924  0.0291004768  0.0024794113 
##          PC68          PC80          PC82          PC94         PC106 
##  0.0603668739 -0.0006394048  0.0004690230 -0.0199222660  0.0352566133 
##         PC113         PC119         PC120         PC121         PC133 
## -0.0349624041 -0.0328150993  0.0106274496 -0.0496255779 -0.0069385262 
## [1] "myfit_mdl: train diagnostics complete: 40.326000 secs"
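The two coefficient listings above bracket `lambdaOpt` because glmnet fits a discrete path of lambda values and the tuned value rarely falls exactly on a path point. A minimal sketch of retrieving interpolated coefficients at `lambdaOpt` itself (`fit` here is a stand-in name for the caret `train` object):

```r
library(glmnet)
# coef() with s = lambdaOpt linearly interpolates between the two nearest
# path lambdas (exact = FALSE is the default in coef.glmnet)
cf <- coef(fit$finalModel, s = fit$finalModel$lambdaOpt)
cf[cf[, 1] != 0, , drop = FALSE]   # keep only the nonzero coefficients
```

Printing both bracketing fits, as done above, is a useful diagnostic for how sensitive the selected sparse coefficient set is to small changes in lambda.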

##          Prediction
## Reference   D   R
##         D 416 413
##         R 305 830
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.344196e-01   2.372535e-01   6.126780e-01   6.557590e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   1.853398e-07   6.518629e-05

##          Prediction
## Reference   D   R
##         D  41 168
##         R  29 257
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.020202e-01   1.040930e-01   5.573855e-01   6.454288e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   1.476045e-01   8.189877e-23 
## [1] "myfit_mdl: predict complete: 50.628000 secs"
##                        id
## 1 All.X#zv.pca#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                      38.86                 0.868
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5751375    0.2436671    0.9066079       0.6629154
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6980656        0.5930076
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1              0.612678              0.655759    0.09455366
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5473868    0.1961722    0.8986014       0.6000268
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                    0.5       0.7229255        0.6020202
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5573855             0.6454288      0.104093
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01145211      0.02385109
## [1] "myfit_mdl: exit: 51.116000 secs"
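Assembled from the two model summaries above, the zv.pca variant trades a little fit-sample performance for better OOB accuracy, Kappa, and AUC:

```r
# OOB metrics copied from the two summary blocks above
data.frame(
  id           = c("All.X#range#rcv#glmnet", "All.X#zv.pca#rcv#glmnet"),
  Accuracy.OOB = c(0.5939394, 0.6020202),
  Kappa.OOB    = c(0.05515512, 0.10409300),
  AUCROCR.OOB  = c(0.5655804, 0.6000268)
)
```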
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#ica#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst
## = list(id.prefix = bstMdlIdComponents$family, : myfit_mdl: preProcess
## method: range currently does not work for columns with no variance:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
## [1] "myfit_mdl: setup complete: 0.846000 secs"
## Warning in preProcess.default(method = "ica", n.comp = 3, x =
## structure(c(-0.480112420809766, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
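The flagged columns are dummy/interaction terms that are constant in this sample (e.g. empty cluster interactions), which ICA cannot use. Dropping them before preprocessing, as the zv.pca variant effectively does, avoids the repeated warnings. An illustrative sketch (`X` stands in for the model matrix; `preProcess` with `method = "ica"` requires the fastICA package):

```r
# remove zero-variance columns before ICA preprocessing
zv <- which(apply(X, 2, var) == 0)
if (length(zv) > 0) X <- X[, -zv, drop = FALSE]
pp <- caret::preProcess(X, method = "ica", n.comp = 3)
```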
## + Fold1.Rep1: alpha=0.100, lambda=0.0209
## Warning in preProcess.default(thresh = 0.95, k = 5, method
## = "ica", n.comp = 3, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
## - Fold1.Rep1: alpha=0.100, lambda=0.0209 
## (preProcess.default re-issues the identical zero-variance warning before
##  every fold/alpha tuning run; the repeated warning + fold log blocks for
##  the remaining runs are elided)
## + Fold3.Rep3: alpha=0.550, lambda=0.0209
## Warning in preProcess.default(thresh = 0.95, k = 5, method
## = "ica", n.comp = 3, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
## - Fold3.Rep3: alpha=0.550, lambda=0.0209 
## + Fold3.Rep3: alpha=0.775, lambda=0.0209
## Warning in preProcess.default(thresh = 0.95, k = 5, method
## = "ica", n.comp = 3, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
## - Fold3.Rep3: alpha=0.775, lambda=0.0209 
## + Fold3.Rep3: alpha=1.000, lambda=0.0209
## Warning in preProcess.default(thresh = 0.95, k = 5, method
## = "ica", n.comp = 3, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
## - Fold3.Rep3: alpha=1.000, lambda=0.0209 
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.775, lambda = 0.0209 on full training set
## Warning in preProcess.default(thresh = 0.95, k = 5, method
## = "ica", n.comp = 3, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
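The zero-variance warnings come from interaction terms (e.g. `Q115611.fctrNo:.clusterid.fctr4`) that are constant in the training data, so `preProcess` cannot standardize them for ICA. A minimal sketch of one way to silence these up front, using caret's `"zv"` filter ahead of any other preprocessing; `X` here is a hypothetical stand-in for the model matrix built from the All.X formula:

```r
library(caret)  # preProcess

# Hypothetical model matrix; `const` mimics the constant interaction columns
# flagged in the warnings above.
X <- data.frame(a = rnorm(20), b = runif(20), const = rep(0, 20))

# "zv" drops zero-variance columns before the remaining methods run,
# so downstream ICA/PCA never sees them.
pp <- preProcess(X, method = c("zv", "center", "scale"))
X_clean <- predict(pp, X)   # `const` is no longer present
```

This could also be passed via `train(..., preProcess = c("zv", "ica"))` so the filter is re-applied inside each resampling fold.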

## [1] "myfit_mdl: train complete: 25.231000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda
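The warning above means the selected `lambda` sits on an edge of the searched `tuneGrid`, so the true optimum may lie outside the grid. A hedged sketch of the usual remedy (widen the lambda range and re-check `bestTune`), on a small synthetic data frame since the report's own data is not reproduced here:

```r
library(caret)
library(glmnet)

set.seed(2016)
df <- data.frame(y  = factor(sample(c("D", "R"), 200, replace = TRUE)),
                 x1 = rnorm(200), x2 = rnorm(200))

# Span several orders of magnitude so bestTune is unlikely to land on an edge
grid <- expand.grid(alpha  = c(0.1, 0.55, 1.0),
                    lambda = 10 ^ seq(-4, 0, length.out = 10))

fit <- train(y ~ ., data = df, method = "glmnet",
             trControl = trainControl(method = "cv", number = 3),
             tuneGrid  = grid)

fit$bestTune  # verify lambda is interior to the grid before trusting it
```

If `bestTune$lambda` still equals the grid's minimum or maximum, extend the range in that direction and refit.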

##             Length Class      Mode     
## a0           35    -none-     numeric  
## beta        105    dgCMatrix  S4       
## df           35    -none-     numeric  
## dim           2    -none-     numeric  
## lambda       35    -none-     numeric  
## dev.ratio    35    -none-     numeric  
## nulldev       1    -none-     numeric  
## npasses       1    -none-     numeric  
## jerr          1    -none-     numeric  
## offset        1    -none-     logical  
## classnames    2    -none-     character
## call          5    -none-     call     
## nobs          1    -none-     numeric  
## lambdaOpt     1    -none-     numeric  
## xNames        3    -none-     character
## problemType   1    -none-     character
## tuneValue     2    data.frame list     
## obsLevels     2    -none-     character
## [1] "min lambda > lambdaOpt:"
## (Intercept)        ICA1        ICA3 
##  0.31594604  0.08593408  0.12497834 
## [1] "max lambda < lambdaOpt:"
## (Intercept)        ICA1        ICA3 
##  0.31616464  0.09239997  0.13152721 
## [1] "myfit_mdl: train diagnostics complete: 25.857000 secs"

##          Prediction
## Reference   D   R
##         D 232 597
##         R 222 913
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.829939e-01   8.973221e-02   5.608226e-01   6.049170e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   3.324889e-01   4.976159e-39
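The statistics block above is the standard `caret::confusionMatrix` output; the tiny `McnemarPValue` reflects the strong asymmetry between the two error types (597 D-predicted-R vs. 222 R-predicted-D). A minimal sketch reproducing the Accuracy figure from the printed counts (table rebuilt from the output above; note Accuracy and Kappa are invariant to transposing the table, though Sensitivity/Specificity are not):

```r
library(caret)

# Rebuild the fit-data table printed above (rows = Reference, cols = Prediction)
tbl <- as.table(matrix(c(232, 597,
                         222, 913),
                       nrow = 2, byrow = TRUE,
                       dimnames = list(Reference  = c("D", "R"),
                                       Prediction = c("D", "R"))))

cm <- confusionMatrix(tbl)
cm$overall["Accuracy"]  # (232 + 913) / 1964 = 0.58299, matching the report
```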

##          Prediction
## Reference   D   R
##         D  10 199
##         R   4 282
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.898990e-01   3.872461e-02   5.451295e-01   6.335883e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   3.090057e-01   3.211368e-42 
## [1] "myfit_mdl: predict complete: 36.354000 secs"
##                     id
## 1 All.X#ica#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              15                       24.3                 0.466
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5061052   0.02895054    0.9832599        0.575577
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6903592        0.5804484
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.5608226              0.604917    0.01548905
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5169304   0.04784689     0.986014       0.5722722
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                    0.5       0.7353325         0.589899
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5451295             0.6335883    0.03872461
##   max.AccuracySD.fit max.KappaSD.fit
## 1        0.005085043      0.01503309
## [1] "myfit_mdl: exit: 36.769000 secs"
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#spatialSign#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.701000 secs"
## Warning in preProcess.default(method = "spatialSign", x =
## structure(c(-0.480112420809766, : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.325, lambda = 0.0377 on full training set
## Warning in preProcess.default(thresh = 0.95, k = 5, method
## = "spatialSign", : These variables have zero variances:
## Q115611.fctrNo:.clusterid.fctr4, Q115611.fctrYes:.clusterid.fctr4,
## Q115611.fctrNo:.clusterid.fctr5, Q115611.fctrYes:.clusterid.fctr5,
## YOB.Age.fctrNA:YOB.Age.dff

## [1] "myfit_mdl: train complete: 20.201000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             83  -none-     numeric  
## beta        20833  dgCMatrix  S4       
## df             83  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         83  -none-     numeric  
## dev.ratio      83  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##                     (Intercept)                      Edn.fctr^6 
##                    0.3285896652                   -0.1083051493 
##                   Hhold.fctrMKy                   Hhold.fctrPKn 
##                    0.3952570568                   -0.8920595299 
##                   Hhold.fctrPKy                   Hhold.fctrSKy 
##                   -0.5085266490                   -0.2354450049 
##                   Income.fctr^5                  Q100562.fctrNo 
##                    0.1370126084                   -0.2312908858 
##                  Q100689.fctrNo                 Q100689.fctrYes 
##                    0.0001435259                   -0.4477790273 
##                 Q101163.fctrDad                  Q104996.fctrNo 
##                    1.2053679521                    0.4521350729 
##                 Q105655.fctrYes                 Q106042.fctrYes 
##                    0.0625911110                   -0.0879620383 
##                  Q106272.fctrNo                 Q106272.fctrYes 
##                   -0.1876800494                    0.0506854747 
##                 Q106388.fctrYes                  Q106997.fctrGr 
##                    0.2580197286                    0.8931599400 
##                  Q106997.fctrYy                 Q107491.fctrYes 
##                   -0.2398917465                   -0.1205896719 
##                Q108855.fctrYes!                  Q110740.fctrPC 
##                    0.9800935004                    0.6300801933 
##                 Q111220.fctrYes                 Q111848.fctrYes 
##                   -0.1017116998                   -0.4166199020 
##                  Q112512.fctrNo                  Q113181.fctrNo 
##                   -0.0642970006                   -1.6911369705 
##                 Q113181.fctrYes                  Q115611.fctrNo 
##                    0.6522689930                   -1.6345038505 
##                 Q115611.fctrYes                  Q115899.fctrCs 
##                    2.4787579445                   -0.1403464665 
##                  Q115899.fctrMe               Q116881.fctrRight 
##                    0.2393979021                    1.4598633343 
##                  Q116953.fctrNo                 Q119334.fctrYes 
##                    0.3036883428                    0.0849216686 
##              Q119650.fctrGiving             Q120472.fctrScience 
##                    0.2165810364                    0.4346347199 
##                  Q121699.fctrNo                 Q122120.fctrYes 
##                    0.0201735219                    0.8243508234 
##                  Q122771.fctrPt                 Q123621.fctrYes 
##                    0.0659253587                    0.4275148439 
##                  Q124742.fctrNo                  Q98059.fctrYes 
##                   -0.0833654825                   -0.5460025571 
##                   Q98197.fctrNo                   Q98869.fctrNo 
##                   -1.1595064796                   -0.9115034830 
##                   Q99480.fctrNo                  Q99716.fctrYes 
##                   -0.7675816884                   -0.2651929719 
## Q115611.fctrNo:.clusterid.fctr2 Q115611.fctrNA:.clusterid.fctr5 
##                    0.2394614329                    0.1621440523 
## [1] "max lambda < lambdaOpt:"
##                     (Intercept)                      Edn.fctr^6 
##                     0.329665930                    -0.184472250 
##                   Hhold.fctrMKy                   Hhold.fctrPKn 
##                     0.429181534                    -0.960013796 
##                   Hhold.fctrPKy                   Hhold.fctrSKn 
##                    -0.597580760                    -0.053624168 
##                   Hhold.fctrSKy                   Income.fctr.C 
##                    -0.317073552                     0.049504226 
##                   Income.fctr^5                   Income.fctr^6 
##                     0.213257458                    -0.036987829 
##                  Q100562.fctrNo                  Q100689.fctrNo 
##                    -0.289311516                     0.070568813 
##                 Q100689.fctrYes                 Q101163.fctrDad 
##                    -0.489765683                     1.266596632 
##                  Q104996.fctrNo                 Q105655.fctrYes 
##                     0.550009541                     0.127975959 
##                 Q106042.fctrYes                  Q106272.fctrNo 
##                    -0.146327136                    -0.202359894 
##                 Q106272.fctrYes                 Q106388.fctrYes 
##                     0.084650881                     0.295166794 
##                  Q106997.fctrGr                  Q106997.fctrYy 
##                     0.976785034                    -0.277714678 
##                 Q107491.fctrYes                Q108855.fctrYes! 
##                    -0.205039156                     1.054889798 
##                 Q110740.fctrMac                  Q110740.fctrPC 
##                    -0.015452904                     0.690659171 
##                 Q111220.fctrYes                 Q111848.fctrYes 
##                    -0.185833681                    -0.519599732 
##                  Q112512.fctrNo                  Q113181.fctrNo 
##                    -0.137159222                    -1.759340985 
##                 Q113181.fctrYes                  Q114517.fctrNo 
##                     0.619836399                    -0.012318451 
##                  Q115611.fctrNo                 Q115611.fctrYes 
##                    -1.683914757                     2.520722357 
##                  Q115899.fctrCs                  Q115899.fctrMe 
##                    -0.201818877                     0.257334171 
##                  Q116601.fctrNo                 Q116601.fctrYes 
##                    -0.003285330                     0.050710230 
##               Q116881.fctrRight                  Q116953.fctrNo 
##                     1.497064902                     0.361758936 
##      Q117193.fctrStandard hours                 Q119334.fctrYes 
##                     0.040037629                     0.122547997 
##              Q119650.fctrGiving         Q120194.fctrStudy first 
##                     0.299153888                    -0.066822193 
##             Q120472.fctrScience                  Q121699.fctrNo 
##                     0.482006286                     0.092742755 
##                 Q122120.fctrYes                  Q122771.fctrPt 
##                     0.912711660                     0.107102798 
##                 Q123621.fctrYes                 Q124122.fctrYes 
##                     0.499353260                    -0.074964321 
##                  Q124742.fctrNo                  Q98059.fctrYes 
##                    -0.159976200                    -0.652014362 
##                   Q98197.fctrNo                   Q98869.fctrNo 
##                    -1.136301082                    -0.968667764 
##                   Q99480.fctrNo                  Q99716.fctrYes 
##                    -0.808933650                    -0.295858350 
## Q115611.fctrNo:.clusterid.fctr2 Q115611.fctrNA:.clusterid.fctr5 
##                     0.321424169                     0.249894406 
## YOB.Age.fctr(35,40]:YOB.Age.dff YOB.Age.fctr(65,90]:YOB.Age.dff 
##                    -0.024459789                    -0.009170293 
## [1] "myfit_mdl: train diagnostics complete: 20.886000 secs"

##          Prediction
## Reference   D   R
##         D 463 366
##         R 319 816
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.512220e-01   2.795837e-01   6.296765e-01   6.723151e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   1.804262e-11   7.882076e-02

##          Prediction
## Reference   D   R
##         D  34 175
##         R  27 259
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.919192e-01   7.546786e-02   5.471702e-01   6.355636e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.776271e-01   4.507644e-25 
## [1] "myfit_mdl: predict complete: 31.863000 secs"
##                             id
## 1 All.X#spatialSign#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     19.412                  1.95
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.6105095    0.3884198    0.8325991       0.6815483
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.7043591        0.6070928
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6296765             0.6723151     0.1519254
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5405778    0.3014354    0.7797203       0.5713688
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7194444        0.5919192
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5471702             0.6355636    0.07546786
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01350937      0.02893965
## [1] "myfit_mdl: exit: 32.704000 secs"
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#conditionalX#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.746000 secs"
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.55, lambda = 0.0376 on full training set
## [1] "myfit_mdl: train complete: 15.710000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda

##             Length Class      Mode     
## a0             82  -none-     numeric  
## beta        20582  dgCMatrix  S4       
## df             82  -none-     numeric  
## dim             2  -none-     numeric  
## lambda         82  -none-     numeric  
## dev.ratio      82  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        251  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##       (Intercept)     Hhold.fctrMKy     Hhold.fctrPKn   Q101163.fctrDad 
##        0.23798411        0.01874708       -0.13442634        0.09255838 
##    Q106997.fctrGr  Q108855.fctrYes!    Q110740.fctrPC    Q113181.fctrNo 
##        0.04606394        0.05583211        0.02832168       -0.17278645 
##   Q113181.fctrYes    Q115611.fctrNo   Q115611.fctrYes Q116881.fctrRight 
##        0.08695903       -0.13399486        0.36549211        0.17630958 
##   Q122120.fctrYes   Q123621.fctrYes     Q98197.fctrNo     Q98869.fctrNo 
##        0.04163127        0.00229355       -0.14923426       -0.08488000 
##     Q99480.fctrNo 
##       -0.08339722 
## [1] "max lambda < lambdaOpt:"
##         (Intercept)       Hhold.fctrMKy       Hhold.fctrPKn 
##         0.216260115         0.027552836        -0.172608767 
##     Q101163.fctrDad     Q106388.fctrYes      Q106997.fctrGr 
##         0.104315647         0.004549068         0.066207479 
##    Q108855.fctrYes!      Q110740.fctrPC      Q113181.fctrNo 
##         0.071265356         0.042228759        -0.188579678 
##     Q113181.fctrYes      Q115611.fctrNo     Q115611.fctrYes 
##         0.078590932        -0.148973794         0.363232265 
##   Q116881.fctrRight Q120472.fctrScience     Q122120.fctrYes 
##         0.190801458         0.009230062         0.061075685 
##     Q123621.fctrYes       Q98197.fctrNo       Q98869.fctrNo 
##         0.010278853        -0.157484555        -0.099872253 
##       Q99480.fctrNo 
##        -0.095613180 
## [1] "myfit_mdl: train diagnostics complete: 16.415000 secs"

##          Prediction
## Reference   D   R
##         D 429 400
##         R 322 813
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.323829e-01   2.367939e-01   6.106194e-01   6.537503e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   4.843674e-07   4.161629e-03

##          Prediction
## Reference   D   R
##         D  16 193
##         R   8 278
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.939394e-01   5.515512e-02   5.492118e-01   6.375382e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.478775e-01   1.623204e-38 
## [1] "myfit_mdl: predict complete: 26.256000 secs"
##                              id
## 1 All.X#conditionalX#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                     14.873                 1.314
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1       0.5838976     0.318456    0.8493392       0.6603466
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.6925043        0.6098108
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6106194             0.6537503     0.1457828
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5441664    0.2631579    0.8251748       0.5655804
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7344782        0.5939394
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5492118             0.6375382    0.05515512
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01567398      0.03089611
## [1] "myfit_mdl: exit: 26.550000 secs"
## [1] "myfit_mdl: enter: 0.001000 secs"
## [1] "myfit_mdl: fitting model: All.X#zv.pca.spatialSign#rcv#glmnet"
## [1] "    indepVar: Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff"
## [1] "myfit_mdl: setup complete: 0.709000 secs"
## + Fold1.Rep1: alpha=0.100, lambda=0.03768 
## - Fold1.Rep1: alpha=0.100, lambda=0.03768 
## + Fold1.Rep1: alpha=0.325, lambda=0.03768 
## - Fold1.Rep1: alpha=0.325, lambda=0.03768 
## + Fold1.Rep1: alpha=0.550, lambda=0.03768 
## - Fold1.Rep1: alpha=0.550, lambda=0.03768 
## + Fold1.Rep1: alpha=0.775, lambda=0.03768 
## - Fold1.Rep1: alpha=0.775, lambda=0.03768 
## + Fold1.Rep1: alpha=1.000, lambda=0.03768 
## - Fold1.Rep1: alpha=1.000, lambda=0.03768 
## ... (identical alpha sweep of 0.100, 0.325, 0.550, 0.775, 1.000 repeated for Fold2.Rep1 through Fold3.Rep3; lambda = 0.03768 throughout) ...
## Aggregating results
## Selecting tuning parameters
## Fitting alpha = 0.325, lambda = 0.0377 on full training set
## [1] "myfit_mdl: train complete: 591.356000 secs"
## Warning in myfit_mdl(mdl_specs_lst = myinit_mdl_specs_lst(mdl_specs_lst =
## list(id.prefix = bstMdlIdComponents$family, : model's bestTune found at an
## extreme of tuneGrid for parameter: lambda
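The warning above flags that `bestTune` landed at an edge of `tuneGrid` for `lambda` (only one value was searched). A minimal sketch of widening the search, assuming `caret::train` with `method = "glmnet"` as used elsewhere in this report; the grid values are illustrative, not tuned:

```r
# Illustrative only: a wider glmnet tuning grid so caret can actually search
# lambda instead of pinning it at the single value 0.03768 reported above.
tune_grid <- expand.grid(
    alpha  = c(0.100, 0.325, 0.550, 0.775, 1.000),  # same alphas as above
    lambda = 10 ^ seq(-4, -1, length.out = 10)      # log-spaced lambda range
)
nrow(tune_grid)  # 50 candidate (alpha, lambda) pairs
# refit <- caret::train(x, y, method = "glmnet", trControl = ..., tuneGrid = tune_grid)
```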

##             Length Class      Mode     
## a0            100  -none-     numeric  
## beta        38500  dgCMatrix  S4       
## df            100  -none-     numeric  
## dim             2  -none-     numeric  
## lambda        100  -none-     numeric  
## dev.ratio     100  -none-     numeric  
## nulldev         1  -none-     numeric  
## npasses         1  -none-     numeric  
## jerr            1  -none-     numeric  
## offset          1  -none-     logical  
## classnames      2  -none-     character
## call            5  -none-     call     
## nobs            1  -none-     numeric  
## lambdaOpt       1  -none-     numeric  
## xNames        385  -none-     character
## problemType     1  -none-     character
## tuneValue       2  data.frame list     
## obsLevels       2  -none-     character
## [1] "min lambda > lambdaOpt:"
##                     (Intercept)                      Edn.fctr^6 
##                     0.329991765                    -0.052919874 
##                   Hhold.fctrMKy                   Hhold.fctrPKn 
##                     0.407893163                    -0.494920989 
##                   Hhold.fctrPKy                   Hhold.fctrSKn 
##                    -0.262124055                    -0.148876655 
##                  Q100562.fctrNo                  Q100689.fctrNo 
##                    -0.424057583                     0.003983168 
##                 Q100689.fctrYes                 Q101163.fctrDad 
##                    -0.356400700                     1.069891386 
##                  Q104996.fctrNo                 Q106042.fctrYes 
##                     0.271864079                    -0.149781433 
##                  Q106272.fctrNo                 Q106388.fctrYes 
##                    -0.208271429                     0.305149471 
##                  Q106997.fctrGr                  Q106997.fctrYy 
##                     0.694613791                    -0.250527691 
##                 Q107491.fctrYes                Q108855.fctrYes! 
##                    -0.009759907                     0.710454016 
##                 Q110740.fctrMac                  Q110740.fctrPC 
##                    -0.032018155                     0.464848180 
##                 Q111220.fctrYes                 Q111848.fctrYes 
##                    -0.064630216                    -0.095160073 
##                  Q113181.fctrNo                 Q113181.fctrYes 
##                    -1.522244959                     0.676586996 
##                  Q115611.fctrNo                 Q115611.fctrYes 
##                    -1.413046501                     2.369378010 
##                  Q115899.fctrCs                  Q115899.fctrMe 
##                    -0.055117406                     0.140114962 
##               Q116881.fctrRight                  Q116953.fctrNo 
##                     1.403024979                     0.137827982 
##      Q117193.fctrStandard hours                 Q119334.fctrYes 
##                     0.090827982                     0.080953399 
##              Q119650.fctrGiving             Q120472.fctrScience 
##                     0.054937481                     0.271856261 
##                 Q122120.fctrYes                 Q123621.fctrYes 
##                     0.605512533                     0.377268243 
##                  Q98059.fctrYes                   Q98197.fctrNo 
##                    -0.345475132                    -1.054566193 
##                   Q98869.fctrNo                   Q99480.fctrNo 
##                    -0.786995824                    -0.737861700 
##                  Q99716.fctrYes Q115611.fctrNo:.clusterid.fctr2 
##                    -0.085849329                     0.084813863 
## YOB.Age.fctr(40,50]:YOB.Age.dff                             PC3 
##                     0.001897725                    -0.004662543 
##                             PC7                             PC9 
##                    -0.019589189                    -0.001243451 
##                            PC13                            PC14 
##                     0.010210744                    -0.020008412 
##                            PC17                            PC24 
##                     0.002954924                    -0.001727598 
##                            PC26                            PC41 
##                    -0.008565914                     0.013830991 
##                            PC43                            PC50 
##                     0.014920820                    -0.006278934 
##                            PC52                            PC56 
##                    -0.006270187                    -0.009209784 
##                            PC68                            PC82 
##                     0.044920443                     0.001240941 
##                            PC90                            PC91 
##                    -0.002693600                    -0.004491710 
##                            PC94                            PC97 
##                    -0.022397273                    -0.008558782 
##                           PC106                           PC113 
##                     0.037455415                    -0.041304970 
##                           PC119                           PC120 
##                    -0.024838795                     0.023447012 
##                           PC121                           PC131 
##                    -0.061150797                    -0.007979888 
##                           PC132                           PC133 
##                    -0.001010211                    -0.028441375 
## [1] "max lambda < lambdaOpt:"
##                     (Intercept)                      Edn.fctr^6 
##                    0.3314268864                   -0.1478894944 
##                   Hhold.fctrMKy                   Hhold.fctrPKn 
##                    0.4319963050                   -0.5122106403 
##                   Hhold.fctrPKy                   Hhold.fctrSKn 
##                   -0.2805944855                   -0.2182106427 
##                   Income.fctr^6                  Q100562.fctrNo 
##                   -0.0060717225                   -0.4791515059 
##                  Q100689.fctrNo                 Q100689.fctrYes 
##                    0.0604714353                   -0.4049740314 
##                 Q101163.fctrDad                  Q104996.fctrNo 
##                    1.1299829953                    0.3555893748 
##                 Q104996.fctrYes                 Q105655.fctrYes 
##                   -0.0047951607                    0.0314608980 
##                 Q106042.fctrYes                  Q106272.fctrNo 
##                   -0.2198262377                   -0.2418399889 
##                 Q106388.fctrYes                  Q106997.fctrGr 
##                    0.3571553662                    0.7443744524 
##                  Q106997.fctrYy                 Q107491.fctrYes 
##                   -0.2799701232                   -0.0677995130 
##                Q108855.fctrYes!                 Q110740.fctrMac 
##                    0.7478598853                   -0.0715001718 
##                  Q110740.fctrPC                 Q111220.fctrYes 
##                    0.5070330383                   -0.1238869380 
##                 Q111848.fctrYes                  Q113181.fctrNo 
##                   -0.1765378901                   -1.5553735772 
##                 Q113181.fctrYes                  Q115611.fctrNo 
##                    0.6719922874                   -1.4478802162 
##                 Q115611.fctrYes                  Q115899.fctrCs 
##                    2.4135494125                   -0.1040042897 
##                  Q115899.fctrMe               Q116881.fctrRight 
##                    0.1350056999                    1.4309445669 
##                  Q116953.fctrNo      Q117193.fctrStandard hours 
##                    0.2132673290                    0.1581146461 
##                 Q119334.fctrYes              Q119650.fctrGiving 
##                    0.1076022270                    0.1030371035 
##             Q120472.fctrScience                 Q122120.fctrYes 
##                    0.2914142541                    0.6530497413 
##                 Q123621.fctrYes                  Q98059.fctrYes 
##                    0.3944701882                   -0.4264209459 
##                   Q98197.fctrNo                   Q98869.fctrNo 
##                   -1.0255706012                   -0.8123131119 
##                   Q99480.fctrNo                  Q99716.fctrYes 
##                   -0.7708797212                   -0.0960683143 
## Q115611.fctrNo:.clusterid.fctr2 Q115611.fctrNA:.clusterid.fctr5 
##                    0.1591960411                    0.0238766175 
## YOB.Age.fctr(40,50]:YOB.Age.dff                             PC3 
##                    0.0660827303                   -0.0056148487 
##                             PC7                             PC9 
##                   -0.0201719445                   -0.0024314081 
##                            PC13                            PC14 
##                    0.0117636327                   -0.0224839484 
##                            PC17                            PC24 
##                    0.0042120325                   -0.0025782874 
##                            PC26                            PC31 
##                   -0.0111706115                   -0.0004018937 
##                            PC41                            PC43 
##                    0.0136460272                    0.0171298367 
##                            PC44                            PC50 
##                    0.0001214101                   -0.0093947928 
##                            PC52                            PC56 
##                   -0.0076113106                   -0.0115042050 
##                            PC68                            PC82 
##                    0.0465172309                    0.0049964697 
##                            PC90                            PC91 
##                   -0.0066533657                   -0.0079880386 
##                            PC94                            PC97 
##                   -0.0249234875                   -0.0114614175 
##                           PC106                           PC109 
##                    0.0405008558                    0.0005246326 
##                           PC113                           PC119 
##                   -0.0461143941                   -0.0296316491 
##                           PC120                           PC121 
##                    0.0286564064                   -0.0672517909 
##                           PC131                           PC132 
##                   -0.0128706092                   -0.0067850811 
##                           PC133 
##                   -0.0330433143 
## [1] "myfit_mdl: train diagnostics complete: 593.182000 secs"

##          Prediction
## Reference   D   R
##         D 471 358
##         R 323 812
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   6.532587e-01   2.851960e-01   6.317388e-01   6.743199e-01   5.779022e-01 
## AccuracyPValue  McnemarPValue 
##   5.012357e-12   1.926148e-01
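As a quick standalone sanity check, the fit-partition Accuracy and Cohen's Kappa above can be recomputed from the confusion-matrix counts alone (base R, no model objects needed):

```r
# Recompute Accuracy and Kappa from the fit-set confusion matrix printed above
cm <- matrix(c(471, 358,
               323, 812),
             nrow = 2, byrow = TRUE,
             dimnames = list(Reference = c("D", "R"), Prediction = c("D", "R")))
n  <- sum(cm)
po <- sum(diag(cm)) / n                      # observed agreement = Accuracy
pe <- sum(rowSums(cm) * colSums(cm)) / n^2   # agreement expected by chance
kappa <- (po - pe) / (1 - pe)
round(c(Accuracy = po, Kappa = kappa), 7)    # 0.6532587, 0.2851960
```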

##          Prediction
## Reference   D   R
##         D  37 172
##         R  28 258
##       Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
##   5.959596e-01   8.722110e-02   5.512541e-01   6.395120e-01   5.777778e-01 
## AccuracyPValue  McnemarPValue 
##   2.199066e-01   4.906264e-24 
## [1] "myfit_mdl: predict complete: 604.315000 secs"
##                                    id
## 1 All.X#zv.pca.spatialSign#rcv#glmnet
##                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             feats
## 1 Q115611.fctr,Q113181.fctr,Q98197.fctr,Q116881.fctr,Q108855.fctr,Q106272.fctr,Q122771.fctr,Q123621.fctr,Q106388.fctr,Q110740.fctr,Q122769.fctr,Q120472.fctr,Q101596.fctr,Q119334.fctr,Q114152.fctr,Q98869.fctr,Q115899.fctr,Q116797.fctr,Q118232.fctr,Gender.fctr,Q105655.fctr,Q99480.fctr,Q123464.fctr,Q120650.fctr,Q122120.fctr,Q107869.fctr,Q120014.fctr,Q102289.fctr,Income.fctr,Q122770.fctr,Q111580.fctr,Q116601.fctr,Q117186.fctr,Q106993.fctr,Q112270.fctr,Q101162.fctr,Q108856.fctr,Q117193.fctr,Q116441.fctr,Q119851.fctr,Q111848.fctr,Q98578.fctr,Q118892.fctr,Q114386.fctr,Q120978.fctr,Q112512.fctr,Q102674.fctr,Q96024.fctr,Q108950.fctr,Q115610.fctr,YOB.Age.fctr,Q112478.fctr,Q116197.fctr,Q124742.fctr,Q106389.fctr,Edn.fctr,Q118117.fctr,Q100562.fctr,Q107491.fctr,Q116448.fctr,Q108754.fctr,Q116953.fctr,Q115602.fctr,Q118233.fctr,Q120012.fctr,Q118237.fctr,Q99581.fctr,.rnorm,Q120194.fctr,Q115777.fctr,Q106997.fctr,Q100680.fctr,Q113584.fctr,Q108343.fctr,Q121700.fctr,Q105840.fctr,Q120379.fctr,Q103293.fctr,Q124122.fctr,Q109367.fctr,Q113992.fctr,Q121699.fctr,Q121011.fctr,Q114748.fctr,Q106042.fctr,Q111220.fctr,Q114517.fctr,Q102687.fctr,Q102906.fctr,Q98078.fctr,Q115390.fctr,Q102089.fctr,Q100010.fctr,Q99982.fctr,Q113583.fctr,Q108342.fctr,Q104996.fctr,Q119650.fctr,Q100689.fctr,Q108617.fctr,Q115195.fctr,Q99716.fctr,Q101163.fctr,Q98059.fctr,Q114961.fctr,Hhold.fctr,Q115611.fctr:.clusterid.fctr,YOB.Age.fctr:YOB.Age.dff
##   max.nTuningRuns min.elapsedtime.everything min.elapsedtime.final
## 1              25                    590.561                15.786
##   max.AUCpROC.fit max.Sens.fit max.Spec.fit max.AUCROCR.fit
## 1        0.608978    0.3835947    0.8343612       0.6868814
##   opt.prob.threshold.fit max.f.score.fit max.Accuracy.fit
## 1                   0.55       0.7045553        0.6064166
##   max.AccuracyLower.fit max.AccuracyUpper.fit max.Kappa.fit
## 1             0.6317388             0.6743199      0.154141
##   max.AUCpROC.OOB max.Sens.OOB max.Spec.OOB max.AUCROCR.OOB
## 1       0.5403938    0.2870813    0.7937063       0.5850035
##   opt.prob.threshold.OOB max.f.score.OOB max.Accuracy.OOB
## 1                   0.45       0.7206704        0.5959596
##   max.AccuracyLower.OOB max.AccuracyUpper.OOB max.Kappa.OOB
## 1             0.5512541              0.639512     0.0872211
##   max.AccuracySD.fit max.KappaSD.fit
## 1         0.01534983      0.03191194
## [1] "myfit_mdl: exit: 605.264000 secs"
##                                     min.elapsedtime.everything
## Random###myrandom_classfr                                0.280
## MFO###myMFO_classfr                                      0.357
## Max.cor.Y.rcv.1X1###glmnet                               0.689
## Max.cor.Y##rcv#rpart                                     1.391
## All.X##rcv#glmnet                                       13.709
## Low.cor.X##rcv#glmnet                                   14.267
## All.X#conditionalX#rcv#glmnet                           14.873
## All.X#zv#rcv#glmnet                                     14.994
## All.X#center.scale#rcv#glmnet                           16.713
## All.X#range#rcv#glmnet                                  17.036
## All.X#scale#rcv#glmnet                                  17.103
## All.X#center#rcv#glmnet                                 17.758
## All.X#BoxCox#rcv#glmnet                                 19.300
## All.X#spatialSign#rcv#glmnet                            19.412
## All.X#nzv#rcv#glmnet                                    19.728
## All.X#ica#rcv#glmnet                                    24.300
## All.X#zv.pca#rcv#glmnet                                 38.860
## All.X#YeoJohnson#rcv#glmnet                             53.786
## All.X#expoTrans#rcv#glmnet                              55.054
## All.X#zv.pca.spatialSign#rcv#glmnet                    590.561
##                  label step_major step_minor label_minor      bgn      end
## 4 fit.models_1_preProc          1          3     preProc  195.123 1287.616
## 5     fit.models_1_end          1          4    teardown 1287.617       NA
##    elapsed
## 4 1092.493
## 5       NA
##         label step_major step_minor label_minor      bgn      end elapsed
## 15 fit.models          7          1           1  165.577 1287.627 1122.05
## 16 fit.models          7          2           2 1287.628       NA      NA

```{r fit.models_2, cache=FALSE, fig.height=10, fig.width=15, eval=myevlChunk(glbChunks, glbOut$pfx)}

# if (sum(is.na(glbObsAll$D.P.http)) > 0)
#         stop("fit.models_3: Why is this happening ?")

#stop("here"); glb2Sav()
sync_glb_obs_df <- function() {
    # Copy model-output columns from the Fit / OOB subsets back into the
    # training and all-observations frames. Merge or cbind ? The positional
    # assignment below assumes the subset rows appear in the same order.
    for (col in setdiff(names(glbObsFit), names(glbObsTrn)))
        glbObsTrn[glbObsTrn$.lcn == "Fit", col] <<- glbObsFit[, col]
    for (col in setdiff(names(glbObsFit), names(glbObsAll)))
        glbObsAll[glbObsAll$.lcn == "Fit", col] <<- glbObsFit[, col]
    if (all(is.na(glbObsNew[, glb_rsp_var])))
        for (col in setdiff(names(glbObsOOB), names(glbObsTrn)))
            glbObsTrn[glbObsTrn$.lcn == "OOB", col] <<- glbObsOOB[, col]
    for (col in setdiff(names(glbObsOOB), names(glbObsAll)))
        glbObsAll[glbObsAll$.lcn == "OOB", col] <<- glbObsOOB[, col]
}
sync_glb_obs_df()
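# Illustration of the "Merge or cbind ?" question above: aligning on a key
# column (hypothetical "UniqueID") with match() is safer than positional
# assignment when row order is not guaranteed. Toy data, not pipeline objects.
demoFit <- data.frame(UniqueID = c(3, 1), pred = c(0.9, 0.2))
demoTrn <- data.frame(UniqueID = 1:3, .lcn = c("Fit", "OOB", "Fit"))
demoIdx <- match(demoFit$UniqueID, demoTrn$UniqueID)
demoTrn[demoIdx, "pred"] <- demoFit$pred  # row 2 (OOB) stays NA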
    
print(setdiff(names(glbObsNew), names(glbObsAll)))

replay.petrisim(pn = glb_analytics_pn, 
    replay.trans = (glb_analytics_avl_objs <- c(glb_analytics_avl_objs, 
        "model.selected")), flip_coord = TRUE)
glb_chunks_df <- myadd_chunk(glb_chunks_df, "fit.data.training", major.inc = TRUE)

```

Step 8.0: fit data training

```{r fit.data.training_0, cache=FALSE, eval=myevlChunk(glbChunks, glbOut$pfx)}

#stop("here"); glb2Sav()
if (glb_is_classification && glb_is_binomial) 
    prob_threshold <- glb_models_df[glb_models_df$id == glbMdlSelId,
                                        "opt.prob.threshold.OOB"] else 
    prob_threshold <- NULL
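# Toy illustration (names and values hypothetical): how such a probability
# threshold binarizes class probabilities into the binomial outcome labels;
# 0.45 echoes the selected model's opt.prob.threshold.OOB reported above.
demoProbs  <- c(0.30, 0.50, 0.80)                  # P(outcome == "R")
demoLabels <- ifelse(demoProbs >= 0.45, "R", "D")  # "D" "R" "R"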

if (grepl("Ensemble", glbMdlFinId)) {
    # Get predictions for each model in ensemble; Outliers that have been moved to OOB might not have been predicted yet
    mdlEnsembleComps <- unlist(str_split(subset(glb_models_df, 
                                                id == glbMdlFinId)$feats, ","))
    if (glb_is_classification) {
        # mdlEnsembleComps <- gsub("\\.prob$", "", mdlEnsembleComps)
        # mdlEnsembleComps <- gsub(paste0("^", 
        #                     gsub(".", "\\.", mygetPredictIds(glb_rsp_var)$value, fixed = TRUE)),
        #                          "", mdlEnsembleComps)
        mdlEnsembleComps <- glb_models_df$id[sapply(glb_models_df$id, function(thsMdlId)
                        mygetPredictIds(glb_rsp_var, thsMdlId)$prob %in% mdlEnsembleComps)]
    } else {
        mdlEnsembleComps <- glb_models_df$id[sapply(glb_models_df$id, function(thsMdlId)
                        mygetPredictIds(glb_rsp_var, thsMdlId)$value %in% mdlEnsembleComps)]
    }
                        
    for (mdl_id in mdlEnsembleComps) {
        glbObsTrn <- glb_get_predictions(df = glbObsTrn, mdl_id = mdl_id, 
                                            rsp_var = glb_rsp_var,
                                            prob_threshold_def = prob_threshold)
        glbObsNew <- glb_get_predictions(df = glbObsNew, mdl_id = mdl_id, 
                                            rsp_var = glb_rsp_var,
                                            prob_threshold_def = prob_threshold)
        # glb_fin_mdl uses the same coefficients as glb_sel_mdl, 
        #   so copy the "Final" columns into "non-Final" columns
        glbObsTrn[, gsub("Final.", "", unlist(mygetPredictIds(glb_rsp_var, mdl_id)))] <-
            glbObsTrn[, unlist(mygetPredictIds(glb_rsp_var, mdl_id))]
        glbObsNew[, gsub("Final.", "", unlist(mygetPredictIds(glb_rsp_var, mdl_id)))] <-
            glbObsNew[, unlist(mygetPredictIds(glb_rsp_var, mdl_id))]
    }    
}
glbObsTrn <- glb_get_predictions(df = glbObsTrn, mdl_id = glbMdlFinId, 
                                     rsp_var = glb_rsp_var,
                                    prob_threshold_def = prob_threshold)

glb_featsimp_df <- myget_feats_importance(mdl=glb_fin_mdl,
                                          featsimp_df=glb_featsimp_df)
#glb_featsimp_df[, paste0(glbMdlFinId, ".imp")] <- glb_featsimp_df$imp
print(glb_featsimp_df)
if (glb_is_classification && glb_is_binomial)
    glb_analytics_diag_plots(obs_df=glbObsTrn, mdl_id=glbMdlFinId, 
            prob_threshold=glb_models_df[glb_models_df$id == glbMdlSelId, 
                                         "opt.prob.threshold.OOB"]) else
    glb_analytics_diag_plots(obs_df=glbObsTrn, mdl_id=glbMdlFinId)                  

dsp_feats_vctr <- c(NULL)
for(var in grep(".imp", names(glb_feats_df), fixed=TRUE, value=TRUE))
    dsp_feats_vctr <- union(dsp_feats_vctr, 
                            glb_feats_df[!is.na(glb_feats_df[, var]), "id"])

# print(glbObsTrn[glbObsTrn$UniqueID %in% FN_OOB_ids, 
#                     grep(glb_rsp_var, names(glbObsTrn), value=TRUE)])

print(setdiff(names(glbObsTrn), names(glbObsAll)))
for (col in setdiff(names(glbObsTrn), names(glbObsAll)))
    # Merge or cbind ?
    glbObsAll[glbObsAll$.src == "Train", col] <- glbObsTrn[, col]

print(setdiff(names(glbObsFit), names(glbObsAll)))
print(setdiff(names(glbObsOOB), names(glbObsAll)))
for (col in setdiff(names(glbObsOOB), names(glbObsAll)))
    # Merge or cbind ?
    glbObsAll[glbObsAll$.lcn == "OOB", col] <- glbObsOOB[, col]
    
print(setdiff(names(glbObsNew), names(glbObsAll)))

#glb2Sav(); all.equal(savObsAll, glbObsAll); all.equal(sav_models_lst, glb_models_lst)
#load(file = paste0(glbOut$pfx, "dsk_knitr.RData"))
#cmpCols <- names(glbObsAll)[!grepl("\\.Final\\.", names(glbObsAll))]; all.equal(savObsAll[, cmpCols], glbObsAll[, cmpCols]); all.equal(savObsAll[, "H.P.http"], glbObsAll[, "H.P.http"]); 

replay.petrisim(pn = glb_analytics_pn, 
    replay.trans = (glb_analytics_avl_objs <- c(glb_analytics_avl_objs, 
        "data.training.all.prediction","model.final")), flip_coord = TRUE)
glb_chunks_df <- myadd_chunk(glb_chunks_df, "predict.data.new", major.inc = TRUE)

```

Step 9.0: predict data new


##                        label step_major step_minor label_minor     bgn
## 15                fit.models          7          1           1 165.577
## 14                fit.models          7          0           0 108.107
## 12   partition.data.training          5          0           0  55.627
## 11              cluster.data          4          0           0   8.543
## 13           select.features          6          0           0 105.202
## 1                 scrub.data          1          0           0   6.165
## 9       extract.features.end          2          6           6   7.359
## 10       manage.missing.data          3          0           0   7.987
## 8    extract.features.string          2          5           5   7.310
## 7      extract.features.text          2          4           4   7.265
## 5     extract.features.image          2          2           2   7.198
## 2             transform.data          1          1           1   7.116
## 4  extract.features.datetime          2          1           1   7.168
## 6     extract.features.price          2          3           3   7.239
## 3           extract.features          2          0           0   7.152
##         end  elapsed duration
## 15 1287.627 1122.050 1122.050
## 14  165.576   57.469   57.469
## 12  105.201   49.574   49.574
## 11   55.627   47.084   47.084
## 13  108.106    2.904    2.904
## 1     7.115    0.950    0.950
## 9     7.987    0.628    0.628
## 10    8.543    0.556    0.556
## 8     7.359    0.049    0.049
## 7     7.310    0.045    0.045
## 5     7.239    0.041    0.041
## 2     7.151    0.035    0.035
## 4     7.198    0.030    0.030
## 6     7.265    0.026    0.026
## 3     7.167    0.015    0.015
## [1] "Total Elapsed Time: 1,287.627 secs"

##                  label step_major step_minor label_minor     bgn      end
## 4 fit.models_1_preProc          1          3     preProc 195.123 1287.616
## 3   fit.models_1_All.X          1          2      glmnet 169.538  195.123
## 1     fit.models_1_bgn          1          0       setup 169.516  169.529
## 2   fit.models_1_All.X          1          1       setup 169.530  169.538
##    elapsed duration
## 4 1092.493 1092.493
## 3   25.585   25.585
## 1    0.013    0.013
## 2    0.008    0.008
## [1] "Total Elapsed Time: 1,287.616 secs"